
Definition of Moments

The \(n^{th}\) moment of a random variable \(X\) is \(E[X^n].\)

The first moment of \(X\) is the mean, \(E[X].\) The second moment of \(X\) is \(E[X^2].\) The third moment of \(X\) is \(E[X^3],\) and so on.
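As a quick example, if \(X\) is the result of rolling a fair six-sided die, then the first two moments are \[E[X] = \sum_{i=1}^6 \frac{1}{6}i = \frac{7}{2} \qquad \text{and} \qquad E[X^2] = \sum_{i=1}^6 \frac{1}{6}i^2 = \frac{91}{6}\]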

Definition of the Moment Generating Function

The moment generating function, or MGF, of a random variable \(X\) is the function \(M(s)\) defined by \[M(s)=E[e^{sX}]\] for all values of \(s\) at which the expectation is finite.


The reason \(M(s)\) is called the moment generating function is that every moment of \(X\) can be computed from \(M(s)\) by differentiating and then evaluating at \(s = 0.\) \[\left.\frac{d^n}{ds^n}M(s)\right|_{s = 0} = E[X^n]\]

This result follows by induction, assuming the derivative and expectation can be interchanged. For \(n = 1,\) \begin{align} \frac{d}{ds}M(s) & = \frac{d}{ds}E[e^{sX}] \\ & = E[\frac{d}{ds} e^{sX}] \\ & = E[X e^{sX}] \end{align} Plugging in \(s = 0\) we get \(\left.\frac{d}{ds}M(s)\right|_{s = 0} = E[X].\)

Taking a second derivative, we get \begin{align} \frac{d^2}{ds^2}M(s) & = \frac{d}{ds}E[Xe^{sX}] \\ & = E[\frac{d}{ds} Xe^{sX}] \\ & = E[X^2 e^{sX}] \end{align} Plugging in \(s = 0\) we get \(\left.\frac{d^2}{ds^2}M(s)\right|_{s = 0} = E[X^2].\)

Assuming \(\frac{d^n}{ds^n}M(s) = E[X^n e^{sX}],\) we get \begin{align} \frac{d^{n+1}}{ds^{n+1}}M(s) & = \frac{d}{ds}E[X^n e^{sX}] \\ & = E[\frac{d}{ds} X^n e^{sX}] \\ & = E[X^{n+1} e^{sX}] \end{align} Therefore, \(\left.\frac{d^{n+1}}{ds^{n+1}}M(s)\right|_{s = 0} = E[X^{n+1}].\)
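As an informal check of the moment-generating property (a sketch in Python assuming SymPy is available, not part of the proof), the snippet below differentiates the Bernoulli\((p)\) MGF derived in the next section and recovers \(E[X^n] = p\) for each \(n.\)

import sympy as sp

s, p = sp.symbols('s p', positive=True)

# MGF of a Bernoulli(p) random variable (derived below): (1-p) + p*e^s
M = (1 - p) + p * sp.exp(s)

# The n-th derivative at s = 0 should equal E[X^n]; for a Bernoulli(p)
# random variable, X^n = X, so E[X^n] = p for every n >= 1.
for n in range(1, 5):
    print(n, sp.simplify(sp.diff(M, s, n).subs(s, 0)))  # prints p each time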

Moment Generating Functions of Common Distributions

We find the MGF of several common distributions. We frequently make use of the fact that for a discrete random variable with pmf \(p,\) \[E[g(X)] = \sum_i p(i)g(i)\] and for a continuous random variable with pdf \(f,\) \[E[g(X)] = \int f(x)g(x)\,dx\]
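For instance, a minimal SymPy sketch (the pmf, pdf, and function \(g\) below are chosen only for illustration) evaluates each formula directly.

import sympy as sp

x = sp.symbols('x')
g = lambda t: t**2

# Discrete case: pmf p(1) = 1/2, p(3) = 1/2, so E[g(X)] = (1/2)(1) + (1/2)(9) = 5
pmf = {1: sp.Rational(1, 2), 3: sp.Rational(1, 2)}
print(sum(prob * g(i) for i, prob in pmf.items()))  # 5

# Continuous case: X uniform on (0, 1), so f(x) = 1 on (0, 1),
# and E[g(X)] = integral of x^2 over (0, 1) = 1/3
print(sp.integrate(1 * g(x), (x, 0, 1)))  # 1/3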


If \(X\) is Bernoulli\((p),\) the MGF of \(X\) is \[M(s) = 1-p+pe^s\]

Proof:
By direct computation, \[E[e^{sX}] = (1-p)e^{0s}+pe^{1s} = 1-p+pe^s\]
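A short SymPy check of the same two-term sum (a sketch assuming SymPy is available):

import sympy as sp

s, p = sp.symbols('s p')

# E[e^{sX}] when X is 1 with probability p and 0 with probability 1-p
mgf = (1 - p) * sp.exp(0 * s) + p * sp.exp(1 * s)
print(mgf)  # prints 1 - p + p*exp(s), up to the order of the terms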


If \(X\) is Geometric\((p),\) the MGF of \(X\) is \[M(s) = \frac{pe^s}{1-(1-p)e^s}\] for all \(s < -\ln(1-p).\)

Proof:
By direct computation, \begin{align} E[e^{sX}] & = \sum_{i=1}^\infty p(1-p)^{i-1}e^{is} \\ & = \frac{p}{1-p} \sum_{i=1}^\infty ((1-p)e^s)^i \end{align} The series converges if \((1-p)e^s < 1.\) Solving for \(s,\) the series converges if \(s < -\ln(1-p).\) When \(s < -\ln(1-p),\) the series converges to \(\frac{r}{1-r}\) where \(r = (1-p)e^s.\) \begin{align} \frac{p}{1-p} \sum_{i=1}^\infty ((1-p)e^s)^i & = \frac{p}{1-p} \left[\frac{(1-p)e^s}{1-(1-p)e^s}\right] \\ & = \frac{pe^s}{1-(1-p)e^s} \end{align}
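As a quick numerical sanity check (a sketch; the values \(p = 0.3\) and \(s = 0.2\) are arbitrary choices satisfying \(s < -\ln(1-p)\)), a long partial sum of the series agrees with the closed form.

import math

p, s = 0.3, 0.2   # requires s < -ln(1-p), which is about 0.357 here
partial = sum(p * (1 - p)**(i - 1) * math.exp(i * s) for i in range(1, 2001))
closed = p * math.exp(s) / (1 - (1 - p) * math.exp(s))
print(partial, closed)   # the two values agree to many decimal places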


If \(X\) is Binomial\((n,p),\) the MGF of \(X\) is \[M(s) = (1-p+pe^s)^n\]

Proof:
Let \(Y_1, Y_2, \dots, Y_n\) be independent random variables, each with the Bernoulli\((p)\) distribution. Then \(\sum_{i=1}^n Y_i\) has the Binomial\((n,p)\) distribution, so \(X\) and \(\sum_{i=1}^n Y_i\) have the same MGF. Therefore, \begin{align} E[e^{sX}] & = E[e^{s\sum_{i=1}^n Y_i}] \\ & = E\left[\prod_{i=1}^n e^{sY_i}\right] \\ & = E[e^{sY_1}]^n \end{align} where the last line follows because \(Y_1, Y_2, \dots, Y_n\) are independent and identically distributed.

The MGF of a Bernoulli\((p)\) random variable is shown above to be \(1-p+pe^s.\) Therefore, \[E[e^{sY_1}]^n = (1-p+pe^s)^n\]
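The same identity can be checked symbolically for a fixed small \(n\) (a sketch assuming SymPy is available; \(n = 3\) is an arbitrary choice) by summing the Binomial\((n,p)\) pmf against \(e^{is}\) directly.

import sympy as sp

s, p = sp.symbols('s p')
n = 3   # arbitrary small n for the check

# E[e^{sX}] computed directly from the Binomial(n, p) pmf
direct = sum(sp.binomial(n, i) * p**i * (1 - p)**(n - i) * sp.exp(i * s)
             for i in range(n + 1))
closed = (1 - p + p * sp.exp(s))**n
print(sp.simplify(direct - closed))   # 0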


If \(X\) is Poisson\((\lambda),\) the MGF of \(X\) is \[M(s) = e^{\lambda(e^s-1)}\]

Proof:
By direct computation, \begin{align} E[e^{sX}] & = \sum_{i=0}^\infty \frac{\lambda^i}{i!}e^{-\lambda}e^{is} \\ & = e^{-\lambda}\sum_{i=0}^\infty \frac{(\lambda e^s)^i}{i!} \\ & = e^{-\lambda}e^{\lambda e^s} \\ & = e^{\lambda (e^s-1)} \end{align}
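A quick numerical check (a sketch; \(\lambda = 2\) and \(s = 0.5\) are arbitrary choices) compares a long partial sum of this series to the closed form.

import math

lam, s = 2.0, 0.5   # arbitrary choices of lambda and s
partial = sum(lam**i / math.factorial(i) * math.exp(-lam) * math.exp(i * s)
              for i in range(100))
closed = math.exp(lam * (math.exp(s) - 1))
print(partial, closed)   # the two values agree to many decimal places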


If \(X\) is a uniform continuous random variable over \((a, b),\) the MGF of \(X\) is \[M(s) = \frac{e^{sb}-e^{sa}}{s(b-a)}\] for \(s \neq 0,\) with \(M(0) = 1.\)

Proof:
By direct computation, \begin{align} E[e^{sX}] & = \int_a^b \frac{1}{b-a} e^{sx} dx \\ & = \left. \frac{e^{sx}}{s(b-a)} \right|_a^b \\ & = \frac{e^{sb}-e^{sa}}{s(b-a)} \end{align}
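The integral can be reproduced symbolically (a sketch assuming SymPy is available; declaring \(s > 0\) simply avoids the separate \(s = 0\) case).

import sympy as sp

x, a, b = sp.symbols('x a b', real=True)
s = sp.symbols('s', positive=True)   # avoids the separate s = 0 case

# E[e^{sX}] for X uniform on (a, b)
mgf = sp.integrate(sp.exp(s * x) / (b - a), (x, a, b))
closed = (sp.exp(s * b) - sp.exp(s * a)) / (s * (b - a))
print(sp.simplify(mgf - closed))   # 0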


If \(X\) is a Normal\((\mu, \sigma^2)\) random variable, the MGF of \(X\) is \[M(s) = e^{s\mu+\frac{1}{2}\sigma^2s^2}\]

Proof:
By direct computation, \begin{align} E[e^{sX}] & = \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-\mu)^2}{2\sigma^2}}e^{sx}dx \\ & = \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-\mu)^2}{2\sigma^2}+sx}dx \end{align} Next, we rearrange the terms in the exponent. \begin{align} -\frac{(x-\mu)^2}{2\sigma^2}+sx & = -\frac{x^2-2\mu x+\mu^2 - 2\sigma^2 sx}{2\sigma^2} \\ & = -\frac{(x-(\mu+\sigma^2 s))^2 - (\mu+\sigma^2 s)^2 + \mu^2}{2\sigma^2} \end{align} where the last line follows by completing the square.

Now that we have rewritten the exponent, we can plug it back in. \begin{align} \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-\mu)^2}{2\sigma^2}+sx}dx & = \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-(\mu+\sigma^2 s))^2 - (\mu+\sigma^2 s)^2 + \mu^2}{2\sigma^2}}dx \\ & = e^{\frac{(\mu+\sigma^2 s)^2 - \mu^2}{2\sigma^2}}\int_{-\infty}^\infty \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-(\mu+\sigma^2 s))^2}{2\sigma^2}}dx \\ & = e^{s\mu+\frac{1}{2}\sigma^2 s^2}\int_{-\infty}^\infty \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-(\mu+\sigma^2 s))^2}{2\sigma^2}}dx \end{align} where the last line follows because \(\frac{(\mu+\sigma^2 s)^2 - \mu^2}{2\sigma^2} = s\mu+\frac{1}{2}\sigma^2 s^2.\) The remaining integral is over the pdf of a normal random variable with mean \(\mu+\sigma^2 s\) and variance \(\sigma^2,\) so it evaluates to \(1.\) This gives the result. \[M(s) = e^{s\mu+\frac{1}{2}\sigma^2 s^2}\]
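Since the only delicate step is the algebra in the exponent, a short SymPy check (a sketch assuming SymPy is available) confirms both the completed square and the final exponent.

import sympy as sp

x, mu, s = sp.symbols('x mu s', real=True)
sigma = sp.symbols('sigma', positive=True)

original = -(x - mu)**2 / (2 * sigma**2) + s * x
completed = -((x - (mu + sigma**2 * s))**2 - (mu + sigma**2 * s)**2 + mu**2) / (2 * sigma**2)
print(sp.simplify(original - completed))   # 0: the completed square matches

leftover = ((mu + sigma**2 * s)**2 - mu**2) / (2 * sigma**2)
print(sp.simplify(leftover - (s * mu + sigma**2 * s**2 / 2)))   # 0: the exponent of M(s)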

Corollary: If \(X\) has a Normal\((\mu, \sigma^2)\) distribution, then \(\text{Var}(X) = \sigma^2.\)

Proof:
We have shown that \(E[X] = \mu,\) but we have not yet shown that \(\text{Var}(X) = \sigma^2.\) We use \(M(s) = e^{s\mu+\frac{1}{2}\sigma^2s^2}\) to find \(E[X^2].\) \begin{align} & M'(s) = (\mu + \sigma^2 s)e^{s\mu+\frac{1}{2}\sigma^2s^2} \\ & M''(s) = \sigma^2 e^{s\mu+\frac{1}{2}\sigma^2s^2} + (\mu + \sigma^2 s)^2e^{s\mu+\frac{1}{2}\sigma^2s^2} \\ & M''(0) = \sigma^2 + \mu^2 \end{align} Therefore, \(E[X^2] = \sigma^2 + \mu^2\) and \[\text{Var}(X) = E[X^2]-E[X]^2 = \sigma^2 + \mu^2 - \mu^2 = \sigma^2\]
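The derivative computation can be reproduced with SymPy (a sketch assuming SymPy is available).

import sympy as sp

s, mu = sp.symbols('s mu', real=True)
sigma = sp.symbols('sigma', positive=True)

M = sp.exp(s * mu + sp.Rational(1, 2) * sigma**2 * s**2)

first = sp.diff(M, s).subs(s, 0)       # E[X]   = mu
second = sp.diff(M, s, 2).subs(s, 0)   # E[X^2] = sigma^2 + mu^2
print(first)                           # mu
print(sp.expand(second))               # mu**2 + sigma**2
print(sp.expand(second - first**2))    # sigma**2, the variance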

Check your understanding:

1. The random variable \(X\) is normally distributed with mean \(1\) and variance \(2.\) What is \(E[X^3]?\)





2. The pmf of \(X\) is \[p(-2) = 0.3, p(1) = 0.4, p(4) = 0.3\] What is the moment generating function of \(X?\)



