
Variance Measures Spread

The variance does not depend on the actual values of X or on where they are centered. It depends only on the distance of the values of X from the expected value of X, or in other words on how "spread out" the values are.




Example: Consider 3 random variables X, Y, and Z defined by
\[
P(X=1)=0.5,\quad P(X=-1)=0.5
\]
\[
P(Y=10)=0.5,\quad P(Y=-10)=0.5
\]
\[
P(Z=9)=0.5,\quad P(Z=11)=0.5
\]
We will find the variance of all three variables using the formula $\operatorname{Var}(X)=E[X^2]-E[X]^2$.

First we find the expected value of each random variable:
\[
E[X]=1 \cdot 0.5+(-1) \cdot 0.5=0
\]
\[
E[Y]=10 \cdot 0.5+(-10) \cdot 0.5=0
\]
\[
E[Z]=9 \cdot 0.5+11 \cdot 0.5=10
\]
Both X and Y are centered around 0, while Z is centered around 10.

Next we find the expected values of the squared variables.
\[
E[X^2]=1^2 \cdot 0.5+(-1)^2 \cdot 0.5=1
\]
\[
E[Y^2]=10^2 \cdot 0.5+(-10)^2 \cdot 0.5=100
\]
\[
E[Z^2]=9^2 \cdot 0.5+11^2 \cdot 0.5=101
\]
Finally, we compute the variances.
\[
\operatorname{Var}(X)=1-0^2=1,\qquad \operatorname{Var}(Y)=100-0^2=100,\qquad \operatorname{Var}(Z)=101-10^2=1
\]
Both X and Z have a variance of 1. Taking the square root, they also have a standard deviation of 1. This means they are on average 1 away from their mean, which is true because they are always exactly 1 away from their mean. Notice that it doesn't matter that Z takes larger values than X.

On the other hand, the values of Y are much farther from the mean. Taking the square root, the standard deviation of Y is 10, because Y is always 10 away from its mean.
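As a quick numerical check of the example above, the Python sketch below applies the formula $\operatorname{Var}(X)=E[X^2]-E[X]^2$ to each of the three two-point distributions. The `variance` helper is not from the original page; it is just for illustration.

```python
# Sketch: verify the variances of X, Y, and Z from the example above.
# Each distribution places probability 0.5 on its two values.

def variance(values, probs):
    """Var = E[V^2] - E[V]^2 for a discrete distribution given as lists."""
    mean = sum(v * p for v, p in zip(values, probs))
    second_moment = sum(v ** 2 * p for v, p in zip(values, probs))
    return second_moment - mean ** 2

print(variance([1, -1], [0.5, 0.5]))    # X: 1   - 0^2  = 1
print(variance([10, -10], [0.5, 0.5]))  # Y: 100 - 0^2  = 100
print(variance([9, 11], [0.5, 0.5]))    # Z: 101 - 10^2 = 1
```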

Variance and Linear Transformations

Adding a constant to a random variable X does not change how spread out its values are, so it does not change the variance.

Claim: $\operatorname{Var}(X+c)=\operatorname{Var}(X)$ for any constant c.

Proof:
This follows from linearity of the expected value.
\[
\begin{aligned}
\operatorname{Var}(X+c) &= E[(X+c)^2]-E[X+c]^2\\
&= E[X^2+2cX+c^2]-(E[X]+c)^2\\
&= E[X^2]+2cE[X]+c^2-\left(E[X]^2+2cE[X]+c^2\right)\\
&= E[X^2]-E[X]^2\\
&= \operatorname{Var}(X)
\end{aligned}
\]
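To see this numerically, here is a small simulation sketch (assuming NumPy is available; the particular distribution and shift are arbitrary choices, not from the original page) showing that adding a constant leaves the sample variance essentially unchanged.

```python
# Sketch: Var(X + c) = Var(X), checked by simulation with NumPy.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=3.0, size=100_000)  # any distribution works here
c = 7.5

print(x.var())        # approximately 9 (= 3^2)
print((x + c).var())  # approximately 9 as well; the shift does not change the spread
```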




Multiplying X by a constant c changes the variance by a factor of $c^2$.

Claim: $\operatorname{Var}(cX)=c^2\operatorname{Var}(X)$ for any constant c.

Proof:
This follows from linearity of the expected value.
\[
\begin{aligned}
\operatorname{Var}(cX) &= E[(cX)^2]-E[cX]^2\\
&= E[c^2X^2]-(cE[X])^2\\
&= c^2E[X^2]-c^2E[X]^2\\
&= c^2\left(E[X^2]-E[X]^2\right)\\
&= c^2\operatorname{Var}(X)
\end{aligned}
\]
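Similarly, a quick simulation sketch (again assuming NumPy; the distribution and the constant are illustrative choices) shows the factor of $c^2$ appearing when a sample is scaled.

```python
# Sketch: Var(cX) = c^2 Var(X), checked by simulation with NumPy.
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=100_000)  # Var(X) = 4 for this distribution
c = 3.0

print(x.var())        # approximately 4
print((c * x).var())  # approximately 36 = 3^2 * 4
```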




Corollary: If a and b are constants, then for any random variable X,
\[
\operatorname{Var}(aX+b)=a^2\operatorname{Var}(X)
\]
This follows by applying the two claims above in turn: $\operatorname{Var}(aX+b)=\operatorname{Var}(aX)=a^2\operatorname{Var}(X)$.

Variance of Independent Random Variables

Claim: If X and Y are independent random variables, then $\operatorname{Var}(X+Y)=\operatorname{Var}(X)+\operatorname{Var}(Y)$.

Proof:
This follows from the expectation of independent random variables.
\[
\begin{aligned}
\operatorname{Var}(X+Y) &= E[(X+Y)^2]-E[X+Y]^2\\
&= E[(X+Y)^2]-(E[X]+E[Y])^2\\
&= E[X^2+2XY+Y^2]-E[X]^2-2E[X]E[Y]-E[Y]^2\\
&= E[X^2]+2E[XY]+E[Y^2]-E[X]^2-2E[X]E[Y]-E[Y]^2
\end{aligned}
\]
Because X and Y are independent, $E[XY]=E[X]E[Y]$. So, $2E[XY]-2E[X]E[Y]=0$. This leaves
\[
E[X^2]+E[Y^2]-E[X]^2-E[Y]^2 = E[X^2]-E[X]^2+E[Y^2]-E[Y]^2 = \operatorname{Var}(X)+\operatorname{Var}(Y)
\]
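This claim can also be checked with a simulation sketch (assuming NumPy; the two independent distributions are arbitrary): the sample variance of the sum comes out close to the sum of the individual sample variances.

```python
# Sketch: Var(X + Y) = Var(X) + Var(Y) for independent X and Y, via simulation.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=100_000)  # Var(X) = 4
y = rng.uniform(0.0, 6.0, size=100_000)           # Var(Y) = 36/12 = 3, independent of x

print(x.var() + y.var())  # approximately 7
print((x + y).var())      # approximately 7 as well
```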

For the following questions, X and Y are independent random variables. The variance of X is 2 and the variance of Y is 4.

1. What is Var(3X+4)?





2. What is Var(Y-X)?





3. What is Var(2X-Y+3)?



