Sums of Independent Discrete Uniform Random Variables
Claim: Let \(X\) and \(Y\) be independent discrete uniform random variables on \(\{0, 1, 2, \dots, n\}\). Then the pmf of \(X+Y\) is
\[p(m) =
\begin{cases}
\frac{m+1}{(n+1)^2} & \text{if } m \leq n \\
\frac{2n-m+1}{(n+1)^2} & \text{if } m > n
\end{cases}
\]
▼ Proof:
Let \(m \in \{0, 1, 2, \dots, 2n\}.\) If \(X+Y = m,\) then \(X = i\) and \(Y = m-i\) for some \(i\) such that \(i\) and \(m-i\) are in \(\{0, 1, 2, \dots, n\}.\)
If \(m \leq n,\) then \(m-i \in \{0, 1, \dots, n\}\) for every \(0 \leq i \leq m,\) so \(i\) ranges over \(0 \leq i \leq m.\) Therefore,
\begin{align}
P(X+Y = m) & = \sum_{i=0}^m P(\{X = i\} \cap \{Y = m-i\}) \\
& = \sum_{i=0}^m P(X = i)P(Y = m-i) \\
& = \sum_{i=0}^m \frac{1}{(n+1)^2} \\
& = \frac{m+1}{(n+1)^2}
\end{align}
If \(m > n,\) then \(i\) must be at least \(m-n\) so that \(m-i \leq n.\) So, \(m-n \leq i \leq n.\) Therefore,
\begin{align}
P(X+Y = m) & = \sum_{i=m-n}^n P(\{X = i\} \cap \{Y = m-i\}) \\
& = \sum_{i=m-n}^n P(X = i)P(Y = m-i) \\
& = \sum_{i=m-n}^n \frac{1}{(n+1)^2} \\
& = \frac{n-(m-n)+1}{(n+1)^2} \\
& = \frac{2n-m+1}{(n+1)^2}
\end{align}
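The formula can be sanity-checked by brute-force enumeration of the \((n+1)^2\) equally likely pairs; a minimal Python sketch (the choice \(n = 4\) is arbitrary):

```python
from fractions import Fraction

def sum_pmf(n, m):
    """pmf of X+Y from the claimed formula, as an exact fraction."""
    if m <= n:
        return Fraction(m + 1, (n + 1) ** 2)
    return Fraction(2 * n - m + 1, (n + 1) ** 2)

def sum_pmf_brute(n, m):
    """pmf of X+Y by counting pairs (x, y) with x + y = m."""
    hits = sum(1 for x in range(n + 1) for y in range(n + 1) if x + y == m)
    return Fraction(hits, (n + 1) ** 2)

n = 4
assert all(sum_pmf(n, m) == sum_pmf_brute(n, m) for m in range(2 * n + 1))
```

Using exact fractions avoids any floating-point tolerance in the comparison.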
Sums of Independent Poisson Random Variables
Claim: Let \(X\) be a Poisson\((\lambda)\) random variable and \(Y\) be a Poisson\((\mu)\) random variable that is independent of \(X.\) Then \(X+Y\) has the Poisson\((\lambda+\mu)\) distribution.
▼ Proof:
To show \(X+Y\) has the Poisson\((\lambda+\mu)\) distribution, we must show \(P(X+Y=k)= \frac{(\lambda+\mu)^k}{k!}e^{-(\lambda+\mu)}\) for any \(k \geq 0.\)
First, fix a value for \(k\) and rewrite the event \(\{X+Y=k\}\) as
\[\{X+Y=k\} = \bigcup_{i=0}^k \{X=i\} \cap \{Y=k-i\}\]
The equality holds because if \(X+Y=k,\) then whatever value \(i\) that \(X\) takes, the value of \(Y\) must be \(k-i.\) Since \(X\) and \(Y\) are nonnegative, both \(X\) and \(Y\) must be less than or equal to \(k\) when \(X+Y=k.\)
Now compute the probability. Since the events \((\{X=i\} \cap \{Y=k-i\} : 0 \leq i \leq k)\) are mutually exclusive, we can turn the probability of the union into the sum of the probabilities.
\begin{align}
P(X+Y=k) & = P\left(\bigcup_{i=0}^k \{X=i\} \cap \{Y=k-i\}\right) \\
& = \sum_{i=0}^k P(\{X=i\} \cap \{Y=k-i\})
\end{align}
By the independence of \(X\) and \(Y,\) each term in the sum can be written as a product. Then the pmf's of \(X\) and \(Y\) can be used.
\begin{align}
\sum_{i=0}^k P(\{X=i\} \cap \{Y=k-i\}) & = \sum_{i=0}^k P(\{X=i\})P(\{Y=k-i\}) \\
& = \sum_{i=0}^k \frac{\lambda^i}{i!}e^{-\lambda}\frac{\mu^{k-i}}{(k-i)!}e^{-\mu} \\
& = e^{-(\lambda+\mu)}\sum_{i=0}^k \frac{1}{i!(k-i)!}\lambda^i \mu^{k-i}
\end{align}
Next, multiply and divide by \(k!\) to use the Binomial Theorem.
\begin{align}
e^{-(\lambda+\mu)}\sum_{i=0}^k \frac{1}{i!(k-i)!}\lambda^i \mu^{k-i} & = \frac{e^{-(\lambda+\mu)}}{k!}\sum_{i=0}^k \frac{k!}{i!(k-i)!}\lambda^i \mu^{k-i} \\
& = \frac{e^{-(\lambda+\mu)}}{k!}\sum_{i=0}^k {k \choose i} \lambda^i \mu^{k-i} \\
& = \frac{e^{-(\lambda+\mu)}}{k!}(\lambda+\mu)^k \\
& = \frac{(\lambda+\mu)^k}{k!}e^{-(\lambda+\mu)}
\end{align}
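The convolution identity can be checked numerically by summing the products of the two pmf's and comparing against the Poisson\((\lambda+\mu)\) pmf; the rates below are arbitrary choices for the check:

```python
from math import exp, factorial, isclose

def poisson_pmf(lam, k):
    """pmf of a Poisson(lam) random variable at k."""
    return lam ** k / factorial(k) * exp(-lam)

lam, mu = 1.5, 2.5  # arbitrary rates for the check
for k in range(20):
    # Convolution of the two pmf's, exactly as in the derivation above.
    conv = sum(poisson_pmf(lam, i) * poisson_pmf(mu, k - i) for i in range(k + 1))
    assert isclose(conv, poisson_pmf(lam + mu, k))
```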
Sums of Independent Normally Distributed Random Variables
Claim: Let \(X\) and \(Y\) be independent normally distributed random variables with means \(\mu_X\) and \(\mu_Y\) and variances \(\sigma_X^2\) and \(\sigma_Y^2.\) Then \(X+Y\) is normally distributed with mean \(\mu_X+\mu_Y\) and variance \(\sigma_X^2+\sigma_Y^2.\)
The proof that the sum of two independent, normally distributed random variables is itself normally distributed will not be shown here. Material introduced later will make proving this fact much easier than working through the convolution directly.
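Although the proof is deferred, the mean and variance of the sum are easy to probe by simulation; a Monte Carlo sketch with arbitrarily chosen parameters and sample size:

```python
import random
from statistics import mean, variance

random.seed(0)  # fixed seed so the check is reproducible
mu_x, mu_y = 1.0, -2.0   # arbitrary means
sx2, sy2 = 2.0, 1.0      # arbitrary variances

# random.gauss takes a standard deviation, hence the square roots.
samples = [random.gauss(mu_x, sx2 ** 0.5) + random.gauss(mu_y, sy2 ** 0.5)
           for _ in range(100_000)]

# Sample mean and variance should be close to mu_x + mu_y = -1
# and sx2 + sy2 = 3, matching the claim.
assert abs(mean(samples) - (mu_x + mu_y)) < 0.05
assert abs(variance(samples) - (sx2 + sy2)) < 0.1
```

This checks only the mean and variance, not normality itself.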
Check your understanding:
1. Let \(X\) and \(Y\) be independent discrete uniform random variables on \(\{0, 1, 2, 3, 4, 5\}.\) What is the probability \(4 \leq X + Y \leq 6?\)
Solution: Since \(X\) and \(Y\) are independent discrete uniform random variables on \(\{0, 1, 2, 3, 4, 5\}\) (so \(n = 5\)), the pmf of \(X+Y\) is
\[p(m) =
\begin{cases}
\frac{m+1}{36} & \text{if } m \leq 5 \\
\frac{11-m}{36} & \text{if } m > 5
\end{cases}
\]
Using this formula,
\begin{align}
P(4 \leq X + Y \leq 6) & = P(X+Y = 4) + P(X+Y = 5) + P(X+Y = 6) \\
& = \frac{5}{36}+\frac{6}{36}+\frac{5}{36} \\
& \approx 0.44
\end{align}
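The same answer falls out of direct enumeration of the 36 equally likely pairs:

```python
from fractions import Fraction

# Count pairs (x, y) from {0,...,5} x {0,...,5} with 4 <= x + y <= 6.
hits = sum(1 for x in range(6) for y in range(6) if 4 <= x + y <= 6)
prob = Fraction(hits, 36)
assert prob == Fraction(16, 36)  # = 5/36 + 6/36 + 5/36, about 0.44
```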
2. Let \(X_1\) be Poisson\((1),\) \(X_2\) be Poisson\((3),\) and \(X_3\) be Poisson\((2.5).\) If \(X_1,\) \(X_2\) and \(X_3\) are independent, what is \(P(X_1 + X_2 + X_3 \leq 3)?\)
Solution: Since \(X_1, X_2,\) and \(X_3\) are independent Poisson random variables, \(X_1+X_2+X_3\) is Poisson with parameter \(1+3+2.5 = 6.5.\) So, if we let \(p\) be the pmf of \(X_1+X_2+X_3,\)
\begin{align}
P(X_1 + X_2 + X_3 \leq 3) & = p(0) + p(1) + p(2) + p(3) \\
& = (1+6.5+\frac{6.5^2}{2}+\frac{6.5^3}{6})e^{-6.5} \\
& \approx 0.11
\end{align}
3. The random variables \(X\) and \(Y\) are independent. If \(X\) is normal with mean \(1\) and variance \(2,\) and \(Y\) is normal with mean \(-2\) and variance \(1,\) what is the probability that \(X + Y > 0?\)
Solution: Since \(X\) and \(Y\) are independent normal random variables, \(X+Y\) is normally distributed with mean \(1-2 = -1\) and variance \(2+1 = 3.\) So, letting \(Z\) be a standard normal random variable,
\begin{align}
P(X+Y > 0) & = P\left(\frac{X+Y-(-1)}{\sqrt{3}} > \frac{0-(-1)}{\sqrt{3}}\right) \\
& = P\left(Z > \frac{1}{\sqrt{3}}\right) \\
& = 1 - P\left(Z \leq \frac{1}{\sqrt{3}}\right) \\
& \approx 1 - P(Z \leq 0.58) \\
& \approx 0.28
\end{align}
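The standardization can be double-checked with the error function; note that \(X+Y\) has mean \(-1,\) so the standardized threshold for the event \(\{X+Y > 0\}\) is \(+1/\sqrt{3}\):

```python
from math import erf, sqrt

def normal_cdf(z):
    """Standard normal cdf, expressed via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# X+Y is normal with mean -1 and variance 3, so
# P(X+Y > 0) = P(Z > 1/sqrt(3)) = 1 - Phi(1/sqrt(3)).
prob = 1 - normal_cdf(1 / sqrt(3))
assert abs(prob - 0.28) < 0.005
```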
4. Let \(X\) be Binomial\((5,0.2)\) and \(Y\) be Binomial\((4,0.2).\) Also, let \(X\) and \(Y\) be independent. Find \(P(X+Y \geq 5).\)
Solution: Since \(X\) is Binomial\((5,0.2),\) \(X\) has the same distribution as \(\sum_{i=1}^5 Y_i\) where \(Y_1, Y_2, \dots, Y_5\) are independent Bernoulli random variables with parameter \(0.2.\) Similarly, \(Y\) has the same distribution as \(\sum_{i=6}^9 Y_i\) where \(Y_6, Y_7, Y_8, Y_9\) are independent Bernoulli random variables with parameter \(0.2.\) Since \(X\) and \(Y\) are independent, we can take \(Y_1, Y_2, \dots, Y_9\) to be all independent. So, \(X+Y\) has the same distribution as
\[\sum_{i=1}^5 Y_i+\sum_{i=6}^9 Y_i=\sum_{i=1}^9 Y_i\]
Since the parameter for all the Bernoulli random variables is \(0.2,\) the sum has the same distribution as a Binomial\((9,0.2)\) random variable. In summary, \(X+Y \sim \text{Binomial}(9,0.2).\) So, if we let \(p(i)\) be the pmf of \(X+Y,\)
\begin{align}
P(X+Y \geq 5) & = 1 - P(X+Y < 5) \\
& = 1 - (p(0)+p(1)+p(2)+p(3)+p(4)) \\
& = 1 - \sum_{i=0}^4 {9 \choose i}(0.2)^i(0.8)^{9-i}\\
& \approx 0.02
\end{align}
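The binomial tail is quick to evaluate with math.comb:

```python
from math import comb

n, p = 9, 0.2  # X+Y is Binomial(9, 0.2)
# P(X+Y >= 5) = 1 - P(X+Y <= 4).
prob = 1 - sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(5))
assert abs(prob - 0.02) < 0.001
```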