The variance does not depend on the actual values of \(X\) or the average value of \(X.\) It depends only on the distances of the values of \(X\) from the expected value of \(X,\) or in other words on how "spread out" the values are.
Example: Consider three random variables \(X,\) \(Y,\) and \(Z\) defined by
\begin{align}
& P(X = 1) = 0.5, P(X = -1) = 0.5 \\
& P(Y = 10) = 0.5, P(Y = -10) = 0.5 \\
& P(Z = 9) = 0.5, P(Z = 11) = 0.5 \\
\end{align}
We will find the variance of all three variables using the formula \(\text{Var}(X) = E[X^2] - E[X]^2.\)
First we find the expected value of each random variable:
\begin{align}
& E[X] = 1 \cdot 0.5 + (-1) \cdot 0.5 = 0 \\
& E[Y] = 10 \cdot 0.5 + (-10) \cdot 0.5 = 0 \\
& E[Z] = 9 \cdot 0.5 + 11 \cdot 0.5 = 10\\
\end{align}
Both \(X\) and \(Y\) are centered around \(0\) while \(Z\) is centered around 10.
Next we find the expected values of the squared variables.
\begin{align}
& E[X^2] = 1^2 \cdot 0.5 + (-1)^2 \cdot 0.5 = 1 \\
& E[Y^2] = 10^2 \cdot 0.5 + (-10)^2 \cdot 0.5 = 100 \\
& E[Z^2] = 9^2 \cdot 0.5 + 11^2 \cdot 0.5 = 101\\
\end{align}
Finally, we compute the variances.
\begin{align}
& \text{Var}(X) = 1 - 0^2 = 1 \\
& \text{Var}(Y) = 100 - 0^2 = 100 \\
& \text{Var}(Z) = 101 - 10^2 = 1\\
\end{align}
Both \(X\) and \(Z\) have a variance of \(1.\) Taking the square root, they also have a standard deviation of \(1.\) This matches the intuition that they are on average \(1\) away from their mean; in fact, each is always exactly \(1\) away from its mean. Notice that it does not matter that \(Z\) takes larger values than \(X.\)
On the other hand, the values of \(Y\) are much farther from the mean. Taking the square root, the standard deviation of \(Y\) is \(10,\) because \(Y\) is always \(10\) away from its mean.
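The computations above can be checked with a short Python sketch (the helper `variance` and the distribution lists are illustrative names, not part of any library):

```python
# Exact variance of a discrete random variable from its distribution,
# using the formula Var(X) = E[X^2] - E[X]^2.
def variance(dist):
    """dist: list of (value, probability) pairs whose probabilities sum to 1."""
    mean = sum(p * v for v, p in dist)          # E[X]
    mean_sq = sum(p * v**2 for v, p in dist)    # E[X^2]
    return mean_sq - mean**2

X = [(1, 0.5), (-1, 0.5)]
Y = [(10, 0.5), (-10, 0.5)]
Z = [(9, 0.5), (11, 0.5)]

print(variance(X))  # 1.0
print(variance(Y))  # 100.0
print(variance(Z))  # 1.0
```

As in the worked example, \(X\) and \(Z\) have the same variance even though \(Z\) takes larger values.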
Adding a constant to a random variable \(X\) shifts its values but does not change how spread out they are, so it does not change the variance.
Claim: \(\text{Var}(X + c) = \text{Var}(X)\) for any constant \(c.\)
Multiplying \(X\) by a constant \(c\) scales every distance from the mean by a factor of \(|c|,\) so the variance changes by a factor of \(c^2.\)
Claim: \(\text{Var}(cX) = c^2\text{Var}(X)\) for any constant \(c.\)
Corollary: If \(a\) and \(b\) are constants, then for any random variable \(X\) \[\text{Var}(aX+b) = a^2\text{Var}(X)\]
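The corollary can be verified exactly for the two-point variable \(X\) from the example; this sketch builds the distribution of \(aX+b\) directly (`affine` is a hypothetical helper, not a library function):

```python
# Check Var(aX + b) = a^2 * Var(X) on an exact discrete distribution.
def variance(dist):
    """dist: list of (value, probability) pairs whose probabilities sum to 1."""
    mean = sum(p * v for v, p in dist)
    return sum(p * v**2 for v, p in dist) - mean**2

def affine(dist, a, b):
    """Distribution of aX + b when X has the given distribution."""
    return [(a * v + b, p) for v, p in dist]

X = [(1, 0.5), (-1, 0.5)]  # Var(X) = 1
a, b = -3, 4

print(variance(affine(X, a, b)))  # 9.0
print(a**2 * variance(X))         # 9.0
```

The shift by \(b\) drops out entirely; only \(a^2\) matters.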
Claim: If \(X\) and \(Y\) are independent random variables, then \[\text{Var}(X+Y) = \text{Var}(X) + \text{Var}(Y)\]
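For the variables \(X\) and \(Y\) from the worked example, independence means the joint distribution is the product of the marginals, so the claim can be checked exactly (`sum_independent` is an illustrative helper):

```python
# Check Var(X + Y) = Var(X) + Var(Y) for independent X and Y
# by constructing the exact distribution of X + Y.
def variance(dist):
    """dist: list of (value, probability) pairs whose probabilities sum to 1."""
    mean = sum(p * v for v, p in dist)
    return sum(p * v**2 for v, p in dist) - mean**2

def sum_independent(dx, dy):
    """Distribution of X + Y when X and Y are independent."""
    return [(vx + vy, px * py) for vx, px in dx for vy, py in dy]

X = [(1, 0.5), (-1, 0.5)]    # Var(X) = 1
Y = [(10, 0.5), (-10, 0.5)]  # Var(Y) = 100

print(variance(sum_independent(X, Y)))  # 101.0
print(variance(X) + variance(Y))        # 101.0
```

Note that the claim requires independence; for dependent variables a covariance term also appears.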
For the following questions, \(X\) and \(Y\) are independent random variables. The variance of \(X\) is \(2\) and the variance of \(Y\) is \(4.\)
1. What is \(\text{Var}(-3X+4)?\)
2. What is \(\text{Var}(Y-X)?\)
3. What is \(\text{Var}(2X-Y+3)?\)