
Sums of Independent Discrete Uniform Random Variables

Claim: Let \(X\) and \(Y\) be independent discrete uniform random variables on \(\{0, 1, 2, \dots, n\}\). Then, for \(m \in \{0, 1, 2, \dots, 2n\},\) the pmf of \(X+Y\) is \[p(m) = \begin{cases} \frac{m+1}{(n+1)^2} & \text{if } m \leq n \\ \frac{2n-m+1}{(n+1)^2} & \text{if } m > n \end{cases} \]

Proof:
Let \(m \in \{0, 1, 2, \dots, 2n\}.\) If \(X+Y = m,\) then \(X = i\) and \(Y = m-i\) for some \(i\) such that \(i\) and \(m-i\) are in \(\{0, 1, 2, \dots, n\}.\)

If \(m \leq n,\) then \(i\) and \(m-i\) both lie in \(\{0, 1, 2, \dots, n\}\) exactly when \(0 \leq i \leq m.\) Therefore, \begin{align} P(X+Y = m) & = \sum_{i=0}^m P(\{X = i\} \cap \{Y = m-i\}) \\ & = \sum_{i=0}^m P(X = i)P(Y = m-i) \\ & = \sum_{i=0}^m \frac{1}{(n+1)^2} \\ & = \frac{m+1}{(n+1)^2} \\ \end{align} If \(m > n,\) then \(Y = m-i \leq n\) forces \(i \geq m-n,\) so \(m-n \leq i \leq n.\) Therefore, \begin{align} P(X+Y = m) & = \sum_{i=m-n}^n P(\{X = i\} \cap \{Y = m-i\}) \\ & = \sum_{i=m-n}^n P(X = i)P(Y = m-i) \\ & = \sum_{i=m-n}^n \frac{1}{(n+1)^2} \\ & = \frac{n-(m-n)+1}{(n+1)^2} \\ & = \frac{2n-m+1}{(n+1)^2} \end{align}
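The two cases above can be checked numerically. The sketch below (in Python, using exact rational arithmetic) enumerates every equally likely pair \((x, y)\) for a small \(n\) and compares the resulting pmf of the sum against the claimed formula; the function name `pmf_sum_uniform` is just a label for this check, not notation from the text.

```python
# Sanity check of the claimed pmf by direct enumeration (illustrative only).
from fractions import Fraction

def pmf_sum_uniform(m, n):
    """Claimed pmf of X + Y for independent uniforms on {0, ..., n}."""
    if m <= n:
        return Fraction(m + 1, (n + 1) ** 2)
    return Fraction(2 * n - m + 1, (n + 1) ** 2)

n = 5
# Each of the (n + 1)^2 pairs (x, y) is equally likely; count how many sum to m.
counts = [0] * (2 * n + 1)
for x in range(n + 1):
    for y in range(n + 1):
        counts[x + y] += 1

for m in range(2 * n + 1):
    assert Fraction(counts[m], (n + 1) ** 2) == pmf_sum_uniform(m, n)
```

Because the check uses `Fraction`, the comparison is exact rather than subject to floating-point rounding.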

Sums of Independent Poisson Random Variables

Claim: Let \(X\) be a Poisson\((\lambda)\) random variable and \(Y\) be a Poisson\((\mu)\) random variable that is independent of \(X.\) Then \(X+Y\) has the Poisson\((\lambda+\mu)\) distribution.

Proof:
To show \(X+Y\) has the Poisson\((\lambda+\mu)\) distribution, we must show \(P(X+Y=k)= \frac{(\lambda+\mu)^k}{k!}e^{-(\lambda+\mu)}\) for any \(k \geq 0.\)

First, fix a value for \(k\) and rewrite the event \(\{X+Y=k\}\) as \[\{X+Y=k\} = \bigcup_{i=0}^k \{X=i\} \cap \{Y=k-i\}\] The equality holds because if \(X+Y=k,\) then whatever value \(i\) that \(X\) takes, \(Y\) must equal \(k-i.\) Since \(X\) and \(Y\) are nonnegative, both \(X\) and \(Y\) must be less than or equal to \(k\) when \(X+Y=k.\)

Now compute the probability. Since the events \((\{X=i\} \cap \{Y=k-i\} : 0 \leq i \leq k)\) are mutually exclusive, we can turn the probability of the union into the sum of the probabilities. \begin{align} P(X+Y=k) & = P\left(\bigcup_{i=0}^k \{X=i\} \cap \{Y=k-i\}\right) \\ & = \sum_{i=0}^k P(\{X=i\} \cap \{Y=k-i\}) \end{align} By the independence of \(X\) and \(Y,\) each term in the sum can be written as a product. Then the pmf's of \(X\) and \(Y\) can be used. \begin{align} \sum_{i=0}^k P(\{X=i\} \cap \{Y=k-i\}) & = \sum_{i=0}^k P(\{X=i\})P(\{Y=k-i\}) \\ & = \sum_{i=0}^k \frac{\lambda^i}{i!}e^{-\lambda}\frac{\mu^{k-i}}{(k-i)!}e^{-\mu} \\ & = e^{-(\lambda+\mu)}\sum_{i=0}^k \frac{1}{i!(k-i)!}\lambda^i \mu^{k-i} \\ \end{align} Next, multiply and divide by \(k!\) to use the Binomial Theorem. \begin{align} e^{-(\lambda+\mu)}\sum_{i=0}^k \frac{1}{i!(k-i)!}\lambda^i \mu^{k-i} & = \frac{e^{-(\lambda+\mu)}}{k!}\sum_{i=0}^k \frac{k!}{i!(k-i)!}\lambda^i \mu^{k-i} \\ & = \frac{e^{-(\lambda+\mu)}}{k!}\sum_{i=0}^k {k \choose i} \lambda^i \mu^{k-i} \\ & = \frac{e^{-(\lambda+\mu)}}{k!}(\lambda+\mu)^k \\ & = \frac{(\lambda+\mu)^k}{k!}e^{-(\lambda+\mu)} \\ \end{align}
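As a numerical sanity check (not part of the proof), the convolution sum above can be evaluated directly and compared against the Poisson\((\lambda+\mu)\) pmf; the particular values of \(\lambda\) and \(\mu\) below are arbitrary.

```python
import math

def poisson_pmf(k, lam):
    """Poisson(lam) pmf: lam^k / k! * e^(-lam)."""
    return lam ** k / math.factorial(k) * math.exp(-lam)

lam, mu = 2.0, 3.5
for k in range(20):
    # Convolution: P(X + Y = k) = sum over i of P(X = i) P(Y = k - i).
    conv = sum(poisson_pmf(i, lam) * poisson_pmf(k - i, mu) for i in range(k + 1))
    # Should match the Poisson(lam + mu) pmf up to floating-point error.
    assert math.isclose(conv, poisson_pmf(k, lam + mu), rel_tol=1e-12)
```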

Sums of Independent Normally Distributed Random Variables

Claim: Let \(X\) and \(Y\) be independent normally distributed random variables with means \(\mu_X\) and \(\mu_Y\) and variances \(\sigma_X^2\) and \(\sigma_Y^2.\) Then \(X+Y\) is normally distributed with mean \(\mu_X+\mu_Y\) and variance \(\sigma_X^2+\sigma_Y^2.\)

The proof that the sum of two independent, normally distributed random variables is normally distributed is not shown here. Later material will make proving this fact much easier than working through the convolution directly.
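Although the proof is deferred, the claim can still be illustrated by simulation. The sketch below is a Monte Carlo check using only the Python standard library: it draws many samples of \(X+Y\) and verifies that the sample mean and variance land near \(\mu_X+\mu_Y\) and \(\sigma_X^2+\sigma_Y^2.\) The parameter values, sample size, and tolerances are arbitrary choices for illustration, and the check only confirms the moments, not normality itself.

```python
import random
import statistics

random.seed(0)  # fixed seed so the check is reproducible
mu_x, var_x = 1.0, 2.0
mu_y, var_y = -2.0, 1.0

# random.gauss takes a standard deviation, so pass the square root of the variance.
samples = [random.gauss(mu_x, var_x ** 0.5) + random.gauss(mu_y, var_y ** 0.5)
           for _ in range(100_000)]

# Sample moments should be near mu_x + mu_y = -1 and var_x + var_y = 3.
assert abs(statistics.fmean(samples) - (mu_x + mu_y)) < 0.05
assert abs(statistics.variance(samples) - (var_x + var_y)) < 0.1
```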

Check your understanding:

1. Let \(X\) and \(Y\) be independent discrete uniform random variables on \(\{0, 1, 2, 3, 4, 5\}.\) What is the probability \(4 \leq X + Y \leq 6?\)


2. Let \(X_1\) be Poisson\((1),\) \(X_2\) be Poisson\((3),\) and \(X_3\) be Poisson\((2.5).\) If \(X_1,\) \(X_2\) and \(X_3\) are independent, what is \(P(X_1 + X_2 + X_3 \leq 3)?\)


3. The random variables \(X\) and \(Y\) are independent. If \(X\) is normal with mean \(1\) and variance \(2,\) and \(Y\) is normal with mean \(-2\) and variance \(1,\) what is the probability that \(X + Y > 0?\)


4. Let \(X\) be Binomial\((5,0.2)\) and \(Y\) be Binomial\((4,0.2).\) Also, let \(X\) and \(Y\) be independent. Find \(P(X+Y \geq 5).\)
