
Claim: Let \(X\) and \(Y\) be random variables with finite variances. Then \[\text{Var}(X + Y) = \text{Var}(X) + \text{Var}(Y) + 2\text{Cov}(X, Y)\]

Proof:
Starting with the formula \(\text{Var}(X+Y) = E[(X+Y)^2] - E[X+Y]^2,\) using the linearity of expectation, and finishing with \(\text{Cov}(X,Y) = E[XY] - E[X]E[Y],\) \begin{align} \text{Var}(X+Y) & = E[(X+Y)^2] - E[X+Y]^2 \\ & = E[X^2 + 2XY + Y^2] - (E[X]+E[Y])^2 \\ & = E[X^2] + 2E[XY] + E[Y^2] - E[X]^2 - 2E[X]E[Y] - E[Y]^2 \\ & = E[X^2] - E[X]^2 + E[Y^2] - E[Y]^2 + 2(E[XY] - E[X]E[Y]) \\ & = \text{Var}(X) + \text{Var}(Y) + 2\text{Cov}(X,Y) \end{align}
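As an informal sanity check (separate from the proof), here is a minimal simulation sketch, assuming Python with NumPy and made-up correlated samples; it estimates both sides of the identity from the same data, and the two numbers agree because the identity also holds for sample variances and covariances computed with the same normalization.

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Made-up correlated data: y depends on x, so Cov(X, Y) is nonzero.
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)

# Left side: Var(X + Y); right side: Var(X) + Var(Y) + 2 Cov(X, Y).
lhs = np.var(x + y)
rhs = np.var(x) + np.var(y) + 2 * np.cov(x, y, ddof=0)[0, 1]
print(lhs, rhs)  # the two numbers agree up to floating-point error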



Corollary: If \(X\) and \(Y\) are independent random variables, then \[\text{Var}(X + Y) = \text{Var}(X)+\text{Var}(Y)\] This follows because \(\text{Cov}(X,Y)=0\) when \(X\) and \(Y\) are independent.
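For a quick illustration with hypothetical numbers: if \(X\) and \(Y\) are independent with \(\text{Var}(X) = 2\) and \(\text{Var}(Y) = 7,\) then \[\text{Var}(X + Y) = 2 + 7 = 9\]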

Claim: Let \(X_1, X_2, \dots, X_n\) be random variables with finite variances. Then \[\text{Var}(X_1 + X_2 + \dots + X_n) = \sum_{i=1}^n \text{Var}(X_i) + 2\sum_{i=1}^{n-1} \sum_{j=i+1}^n \text{Cov}(X_i, X_j)\]

Proof:
The claim can be proved by induction on \(n,\) with the claim for \(2\) variables above as the base case. Suppose the claim is true for \(X_1, X_2, \dots, X_{n-1}.\) Then, by the claim for \(2\) variables, \begin{align} \text{Var}(X_1 & + X_2 + \dots + X_n) \\ & = \text{Var}(X_1 + \dots + X_{n-1}) + \text{Var}(X_n) + 2\text{Cov}(X_1 + \dots + X_{n-1}, X_n) \\ & = \text{Var}(X_1 + \dots + X_{n-1}) + \text{Var}(X_n) + 2\sum_{i=1}^{n-1} \text{Cov}(X_i, X_n) \end{align} where we are using \(X_1 + \dots + X_{n-1}\) as the first variable, \(X_n\) as the second variable, and the fact that covariance is additive in its first argument. By the induction hypothesis, \[\text{Var}(X_1 + \dots + X_{n-1}) = \sum_{i=1}^{n-1} \text{Var}(X_i) + 2\sum_{i=1}^{n-2} \sum_{j=i+1}^{n-1} \text{Cov}(X_i, X_j)\] If we group the variances, we get \[\sum_{i=1}^{n-1} \text{Var}(X_i) + \text{Var}(X_n) = \sum_{i=1}^n \text{Var}(X_i)\] Similarly, if we group the covariances, we get \[2\sum_{i=1}^{n-1} \text{Cov}(X_i, X_n) + 2\sum_{i=1}^{n-2} \sum_{j=i+1}^{n-1} \text{Cov}(X_i, X_j) = 2\sum_{i=1}^{n-1} \sum_{j=i+1}^n \text{Cov}(X_i, X_j)\] Combining the terms, we get \[\text{Var}(X_1 + X_2 + \dots + X_n) = \sum_{i=1}^n \text{Var}(X_i) + 2\sum_{i=1}^{n-1} \sum_{j=i+1}^n \text{Cov}(X_i, X_j)\] which is what we wanted to show.
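To make the double sum concrete, writing out the case \(n = 3\) gives \[\text{Var}(X_1 + X_2 + X_3) = \text{Var}(X_1) + \text{Var}(X_2) + \text{Var}(X_3) + 2\text{Cov}(X_1, X_2) + 2\text{Cov}(X_1, X_3) + 2\text{Cov}(X_2, X_3)\] with one covariance term for each of the \(\binom{3}{2} = 3\) pairs of variables.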



Corollary: If \(X_1, X_2, \dots, X_n\) are independent random variables, then \[\text{Var}(X_1 + X_2 + \dots + X_n) = \text{Var}(X_1)+\text{Var}(X_2)+\dots+\text{Var}(X_n)\]
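This follows from the previous claim because every covariance term in the double sum is \(0.\) For example, with hypothetical values, if \(X_1, X_2, X_3\) are independent with variances \(1,\) \(4,\) and \(9,\) then \(\text{Var}(X_1 + X_2 + X_3) = 1 + 4 + 9 = 14.\)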

Example: Let \(X\) be a random variable with variance \(3\) and \(Y\) be a random variable with variance \(5.\) If \(X+Y\) has variance \(2,\) what is the covariance of \(X\) and \(Y?\)

Solution:
We are given \(\text{Var}(X) = 3,\) \(\text{Var}(Y) = 5,\) and \(\text{Var}(X+Y)=2.\) By the formula for the variance of a sum of random variables, \begin{align} & \text{Var}(X + Y) = \text{Var}(X) + \text{Var}(Y) + 2\text{Cov}(X,Y) \Rightarrow \\ & 2 = 3 + 5 + 2\text{Cov}(X,Y) \Rightarrow \\ & \text{Cov}(X,Y) = -3 \end{align} The covariance of \(X\) and \(Y\) is \(-3.\)
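In general, whenever the three variances are known, the formula can be solved for the covariance directly: \[\text{Cov}(X,Y) = \frac{\text{Var}(X+Y) - \text{Var}(X) - \text{Var}(Y)}{2} = \frac{2 - 3 - 5}{2} = -3\]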

Exercises: Let \(X,\) \(Y,\) and \(Z\) be random variables that satisfy the following: \begin{align} & \text{Var}(X) = 2\\ & \text{Var}(Y) = 3\\ & \text{Var}(Z) = 5\\ & \text{Cov}(X,Y) = -2\\ & \text{Cov}(X,Z) = 1 \end{align} The random variables \(Y\) and \(Z\) are independent.

1. Find \(\text{Var}(X+Y).\)





2. Find \(\text{Var}(Y+Z).\)





3. Find \(\text{Var}(X+Y+Z).\)





4. Find \(\text{Var}(2X-Y+Z).\)




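If you would like to check your answers numerically, below is a minimal simulation sketch, assuming Python with NumPy; it draws normal random variables with the covariance structure given above (with \(\text{Cov}(Y,Z) = 0\) because \(Y\) and \(Z\) are independent), so the printed values only approximate the exact answers up to sampling error. Note that problem \(4\) also uses the scaling rules \(\text{Var}(aX) = a^2\text{Var}(X)\) and \(\text{Cov}(aX, bY) = ab\,\text{Cov}(X, Y),\) which are not derived on this page.

import numpy as np

rng = np.random.default_rng(0)

# Covariance matrix of (X, Y, Z) from the given values;
# the (Y, Z) entry is 0 because Y and Z are independent.
cov = np.array([[ 2.0, -2.0, 1.0],
                [-2.0,  3.0, 0.0],
                [ 1.0,  0.0, 5.0]])

samples = rng.multivariate_normal(mean=np.zeros(3), cov=cov, size=200_000)
x, y, z = samples.T

print("Var(X+Y)    ~", np.var(x + y))
print("Var(Y+Z)    ~", np.var(y + z))
print("Var(X+Y+Z)  ~", np.var(x + y + z))
print("Var(2X-Y+Z) ~", np.var(2*x - y + z))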