The variance does not depend on the actual values of X or on their average. It depends only on the distance of the values of X from the expected value of X, or in other words, on how "spread out" the values are.
Example: Consider 3 random variables X, Y and Z defined by
P(X=1) = 0.5, P(X=−1) = 0.5
P(Y=10) = 0.5, P(Y=−10) = 0.5
P(Z=9) = 0.5, P(Z=11) = 0.5
We will find the variance of all three variables using the formula Var(X) = E[X²] − E[X]².
First we find the expected value of each random variable:
E[X] = 1⋅0.5 + (−1)⋅0.5 = 0
E[Y] = 10⋅0.5 + (−10)⋅0.5 = 0
E[Z] = 9⋅0.5 + 11⋅0.5 = 10
Both X and Y are centered around 0 while Z is centered around 10.
Next we find the expected values of the squared variables.
E[X²] = 1²⋅0.5 + (−1)²⋅0.5 = 1
E[Y²] = 10²⋅0.5 + (−10)²⋅0.5 = 100
E[Z²] = 9²⋅0.5 + 11²⋅0.5 = 101
Finally, we compute the variances.
Var(X) = 1 − 0² = 1
Var(Y) = 100 − 0² = 100
Var(Z) = 101 − 10² = 1
Both X and Z have a variance of 1. Taking the square root, they also have a standard deviation of 1. This means they are on average 1 away from their mean, which is true because they are always 1 away from their mean. Notice that it doesn't matter that Z takes larger values than X.
On the other hand, the values of Y are much farther away from the mean. Taking the square root, the standard deviation of Y is 10, because Y is always 10 away from its mean.
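The calculation above can be sketched in plain Python. The `variance` helper below is an illustrative name, not part of any library; it applies Var(X) = E[X²] − E[X]² to a distribution given as a dictionary mapping each value to its probability.

```python
def variance(pmf):
    """Var(X) = E[X^2] - E[X]^2 for a pmf given as {value: probability}."""
    mean = sum(x * p for x, p in pmf.items())
    mean_sq = sum(x**2 * p for x, p in pmf.items())
    return mean_sq - mean**2

# The three two-point distributions from the example.
X = {1: 0.5, -1: 0.5}
Y = {10: 0.5, -10: 0.5}
Z = {9: 0.5, 11: 0.5}

print(variance(X), variance(Y), variance(Z))  # 1.0 100.0 1.0
```

As expected, X and Z both have variance 1 even though Z takes much larger values, while Y has variance 100.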
Adding a constant to a random variable X does not change how spread out the points of X are, so it does not change the variance.
Claim: Var(X+c)=Var(X) for any constant c.
Multiplying X by a constant c changes the variance by a factor of c².
Claim: Var(cX) = c²Var(X) for any constant c.
Corollary: If a and b are constants, then for any random variable X, Var(aX+b) = a²Var(X).
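The corollary can be checked directly on the two-point distribution X from the example. Here a = −3 and b = 4 are arbitrary illustrative constants, and `variance` is the same hypothetical helper as before: transforming each value of X by aX + b leaves a distribution whose variance is a² times the original.

```python
def variance(pmf):
    """Var(X) = E[X^2] - E[X]^2 for a pmf given as {value: probability}."""
    mean = sum(x * p for x, p in pmf.items())
    return sum(x**2 * p for x, p in pmf.items()) - mean**2

X = {1: 0.5, -1: 0.5}
a, b = -3, 4

# Apply x -> a*x + b to every value; the probabilities are unchanged.
aX_plus_b = {a * x + b: p for x, p in X.items()}  # values 1 and 7

print(variance(aX_plus_b))   # 9.0
print(a**2 * variance(X))    # 9.0
```

Note that b shifts the values but not their spread, so only a affects the variance.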
Claim: If X and Y are independent random variables, then Var(X+Y)=Var(X)+Var(Y)
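This claim can also be verified on the example's X and Y. Because they are independent, the joint probability P(X=x, Y=y) factors as P(X=x)P(Y=y), so we can build the exact distribution of X + Y from the two marginals (again using the illustrative `variance` helper from above):

```python
from collections import defaultdict

def variance(pmf):
    """Var(X) = E[X^2] - E[X]^2 for a pmf given as {value: probability}."""
    mean = sum(x * p for x, p in pmf.items())
    return sum(x**2 * p for x, p in pmf.items()) - mean**2

X = {1: 0.5, -1: 0.5}
Y = {10: 0.5, -10: 0.5}

# Distribution of X + Y, using independence: P(X=x, Y=y) = P(X=x)P(Y=y).
sum_pmf = defaultdict(float)
for x, px in X.items():
    for y, py in Y.items():
        sum_pmf[x + y] += px * py

print(variance(sum_pmf))          # 101.0
print(variance(X) + variance(Y))  # 101.0
```

The sum takes the values ±9 and ±11, each with probability 0.25, and its variance is exactly Var(X) + Var(Y) = 1 + 100 = 101.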
For the following questions, X and Y are independent random variables. The variance of X is 2 and the variance of Y is 4.
1. What is Var(−3X+4)?
2. What is Var(Y−X)?
3. What is Var(2X−Y+3)?