
Variance Measures Spread

The variance does not depend on the actual values of \(X\) or on the average value of \(X.\) It depends only on how far the values of \(X\) are from the expected value of \(X,\) or in other words, on how "spread out" the values are.




Example: Consider three random variables \(X,\) \(Y,\) and \(Z\) defined by \begin{align} & P(X = 1) = 0.5, P(X = -1) = 0.5 \\ & P(Y = 10) = 0.5, P(Y = -10) = 0.5 \\ & P(Z = 9) = 0.5, P(Z = 11) = 0.5 \end{align} We will find the variance of all three variables using the formula \(\text{Var}(X) = E[X^2] - E[X]^2.\)

First we find the expected value of each random variable: \begin{align} & E[X] = 1 \cdot 0.5 + (-1) \cdot 0.5 = 0 \\ & E[Y] = 10 \cdot 0.5 + (-10) \cdot 0.5 = 0 \\ & E[Z] = 9 \cdot 0.5 + 11 \cdot 0.5 = 10 \end{align} Both \(X\) and \(Y\) are centered around \(0,\) while \(Z\) is centered around \(10.\)

Next we find the expected values of the squared variables. \begin{align} & E[X^2] = 1^2 \cdot 0.5 + (-1)^2 \cdot 0.5 = 1 \\ & E[Y^2] = 10^2 \cdot 0.5 + (-10)^2 \cdot 0.5 = 100 \\ & E[Z^2] = 9^2 \cdot 0.5 + 11^2 \cdot 0.5 = 101 \end{align} Finally, we compute the variances. \begin{align} & \text{Var}(X) = 1 - 0^2 = 1 \\ & \text{Var}(Y) = 100 - 0^2 = 100 \\ & \text{Var}(Z) = 101 - 10^2 = 1 \end{align} Both \(X\) and \(Z\) have a variance of \(1.\) Taking the square root, they also have a standard deviation of \(1.\) This means they are on average \(1\) away from their mean, which holds exactly here because each is always \(1\) away from its mean. Notice that it does not matter that \(Z\) takes larger values than \(X.\)

On the other hand, the values of \(Y\) are much farther from the mean. Taking the square root, the standard deviation of \(Y\) is \(10,\) because \(Y\) is always \(10\) away from its mean.
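As a quick check of these computations, here is a short Python sketch that evaluates \(E[X^2] - E[X]^2\) directly from each distribution. The function name variance and the encoding of a distribution as a list of (value, probability) pairs are choices made here for illustration; they are not part of the example above.

def variance(pmf):
    # Var(X) = E[X^2] - E[X]^2 for a finite distribution
    # given as a list of (value, probability) pairs
    mean = sum(v * p for v, p in pmf)
    second_moment = sum(v ** 2 * p for v, p in pmf)
    return second_moment - mean ** 2

X = [(1, 0.5), (-1, 0.5)]
Y = [(10, 0.5), (-10, 0.5)]
Z = [(9, 0.5), (11, 0.5)]

print(variance(X))  # 1.0
print(variance(Y))  # 100.0
print(variance(Z))  # 1.0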

Variance and Linear Transformations

Adding a constant to a random variable \(X\) shifts every value by the same amount, so it does not change how spread out the values of \(X\) are, and therefore does not change the variance.

Claim: \(\text{Var}(X + c) = \text{Var}(X)\) for any constant \(c.\)

Proof:
This follows from linearity of the expected value. \begin{align} \text{Var}(X+c) & = E[(X+c)^2] - E[X+c]^2 \\ & = E[X^2 + 2cX + c^2] - (E[X]+c)^2 \\ & = E[X^2] + 2cE[X] + c^2 - (E[X]^2 + 2cE[X] + c^2) \\ & = E[X^2] - E[X]^2 \\ & = \text{Var}(X) \end{align}
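As a numerical sanity check of this claim, the sketch below shifts the two-point variable \(X\) from the example by a constant and confirms the variance is unchanged. The pmf encoding and helper function are the same illustrative choices as before.

def variance(pmf):
    mean = sum(v * p for v, p in pmf)
    return sum(v ** 2 * p for v, p in pmf) - mean ** 2

X = [(1, 0.5), (-1, 0.5)]
c = 7  # any constant shift
X_plus_c = [(v + c, p) for v, p in X]  # distribution of X + c

print(variance(X))         # 1.0
print(variance(X_plus_c))  # 1.0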




Multiplying \(X\) by a constant \(c\) changes the variance by a factor of \(c^2.\)

Claim: \(\text{Var}(cX) = c^2\text{Var}(X)\) for any constant \(c.\)

Proof:
This follows from linearity of the expected value. \begin{align} \text{Var}(cX) & = E[(cX)^2] - E[cX]^2 \\ & = E[c^2X^2] - (cE[X])^2 \\ & = c^2E[X^2] - c^2E[X]^2 \\ & = c^2(E[X^2] - E[X]^2) \\ & = c^2\text{Var}(X) \end{align}
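The same kind of check works for this claim: scaling the values of \(X\) by \(c\) multiplies the variance by \(c^2.\) Again, the helper function and encoding are illustrative choices, not part of the proof.

def variance(pmf):
    mean = sum(v * p for v, p in pmf)
    return sum(v ** 2 * p for v, p in pmf) - mean ** 2

X = [(1, 0.5), (-1, 0.5)]
c = 3
cX = [(c * v, p) for v, p in X]  # distribution of cX

print(variance(cX))          # 9.0
print(c ** 2 * variance(X))  # 9.0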




Corollary: If \(a\) and \(b\) are constants, then for any random variable \(X,\) \[\text{Var}(aX+b) = a^2\text{Var}(X)\] This follows by combining the two claims above: \(\text{Var}(aX+b) = \text{Var}(aX) = a^2\text{Var}(X).\)
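Putting both claims together, a sketch along the same lines confirms the corollary for one choice of constants; the values \(a = 2\) and \(b = 5\) are arbitrary picks for illustration.

def variance(pmf):
    mean = sum(v * p for v, p in pmf)
    return sum(v ** 2 * p for v, p in pmf) - mean ** 2

X = [(1, 0.5), (-1, 0.5)]
a, b = 2, 5
aX_plus_b = [(a * v + b, p) for v, p in X]  # distribution of aX + b

print(variance(aX_plus_b))   # 4.0
print(a ** 2 * variance(X))  # 4.0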

Variance of Independent Random Variables

Claim: If \(X\) and \(Y\) are independent random variables, then \[\text{Var}(X+Y) = \text{Var}(X) + \text{Var}(Y)\]

Proof:
This follows from the product rule for expectations of independent random variables. \begin{align} \text{Var}(X + Y) & = E[(X+Y)^2] - E[X+Y]^2 \\ & = E[(X+Y)^2] - (E[X]+E[Y])^2 \\ & = E[X^2 + 2XY + Y^2] - E[X]^2 - 2E[X]E[Y] - E[Y]^2 \\ & = E[X^2] + 2E[XY] + E[Y^2] - E[X]^2 - 2E[X]E[Y] - E[Y]^2 \end{align} Because \(X\) and \(Y\) are independent, \(E[XY] = E[X]E[Y].\) So, \(2E[XY] - 2E[X]E[Y] = 0.\) This leaves \begin{align} E[X^2] + E[Y^2] - E[X]^2 - E[Y]^2 & = E[X^2] - E[X]^2 + E[Y^2] - E[Y]^2 \\ & = \text{Var}(X) + \text{Var}(Y) \end{align}
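To check this claim numerically, the sketch below builds the distribution of \(X + Y\) for the independent variables \(X\) and \(Y\) from the earlier example. Independence is what justifies multiplying the probabilities when pairing every value of \(X\) with every value of \(Y;\) the names are again illustrative choices.

def variance(pmf):
    mean = sum(v * p for v, p in pmf)
    return sum(v ** 2 * p for v, p in pmf) - mean ** 2

X = [(1, 0.5), (-1, 0.5)]
Y = [(10, 0.5), (-10, 0.5)]
# Under independence, P(X = x, Y = y) = P(X = x) P(Y = y),
# so the distribution of X + Y pairs every value of X with every value of Y.
S = [(x + y, px * py) for x, px in X for y, py in Y]

print(variance(S))                # 101.0
print(variance(X) + variance(Y))  # 101.0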

For the following questions, \(X\) and \(Y\) are independent random variables. The variance of \(X\) is \(2\) and the variance of \(Y\) is \(4.\)

1. What is \(\text{Var}(-3X+4)?\)





2. What is \(\text{Var}(Y-X)?\)





3. What is \(\text{Var}(2X-Y+3)?\)



