How to prove: Moment Generating Function Uniqueness Theorem

Let us first clarify the assumption. Denote the moment generating function of $X$ by $M_X(t)=Ee^{tX}$.

Uniqueness Theorem. If there exists $\delta>0$ such that $M_X(t) = M_Y(t) < \infty$ for all $t \in (-\delta,\delta)$, then $F_X(x) = F_Y(x)$ for all $x \in \mathbb{R}$.

To prove that the moment generating function determines the distribution, there are at least two approaches:

  • To show that finiteness of $M_X$ on $(-\delta,\delta)$ implies that the moments of $X$ do not grow too fast, so that $F_X$ is determined by the moment sequence $(EX^k)_{k\in\mathbb{N}}$, which is in turn determined by $M_X$. This proof can be found in Section 30 of Billingsley, P., Probability and Measure.

  • To show that $M_X$ is analytic and can be extended to $(-\delta,\delta)\times i\mathbb{R} \subseteq \mathbb{C}$, so that $M_X(z)=Ee^{zX}$, so in particular $M_X(it)=\varphi_X(t)$ for all $t\in\mathbb{R}$, and then use the fact that $\varphi_X$ determines $F_X$. For this approach, see Curtiss, J. H. Ann. Math. Statistics 13:430-433 and references therein.
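As a quick sanity check on the first approach, here is a minimal sketch (my own illustration, not from the references above) using sympy: for $X\sim N(0,1)$ the MGF $M_X(t)=e^{t^2/2}$ is finite for every real $t$, and repeated differentiation at $t=0$ recovers the moment sequence $1, 0, 1, 0, 3, 0, 15, \dots$, which in this case determines $F_X$.

```python
# Assumed example (not from the answers above): recover the moments of
# X ~ N(0, 1) from its MGF M_X(t) = exp(t^2 / 2) by differentiating at t = 0.
import sympy as sp

t = sp.symbols('t')
M = sp.exp(t**2 / 2)  # MGF of the standard normal; finite for every real t

# E[X^k] = M^(k)(0): the k-th derivative of the MGF evaluated at 0
moments = [sp.diff(M, t, k).subs(t, 0) for k in range(7)]
print(moments)  # [1, 0, 1, 0, 3, 0, 15]
```

Here $M_X$ is finite on all of $\mathbb{R}$, so the moments grow slowly enough that the moment problem is determinate, which is exactly the situation Billingsley's argument exploits.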

At the undergraduate level, it is reasonable to work with the moment generating function and state the above theorem without proving it, since the proof requires mathematics well beyond that level.

In fact, the proof is so advanced that, at that point, it usually makes more sense to accept working with complex numbers, set the moment generating function aside, and work with the characteristic function $\varphi_X(t)=Ee^{itX}$ instead. Almost every graduate textbook takes this path and proves that the characteristic function determines the distribution as a corollary of the inversion formula.

The proof of the inversion formula is a bit long, but it only requires Fubini's Theorem to interchange an expectation with an integral, and the Dominated Convergence Theorem to interchange an integral with a limit. A direct proof of uniqueness, without the inversion formula, is shorter and simpler: it only requires the Weierstrass approximation theorem to approximate a continuous function by a trigonometric polynomial.

Side remark. If you only admit random variables whose support is contained in $\mathbb{Z}_+$, then the probability generating function $G_X(z)=Ez^X$ determines $p_X$ (and thus $F_X$). This elementary result is proved in most undergraduate textbooks and is mentioned in Did's answer. If you only admit random variables whose support is contained in $\mathbb{Z}$, then it is simpler to show that $\varphi_X$ determines $p_X$, as also mentioned in Did's answer, and the proof uses Fubini.


Explicitly, the two identities just mentioned read:

$$(\forall n\geqslant0)\qquad \left.\frac{\mathrm d^n}{\mathrm ds^n}\mathbb E[s^X]\right|_{s=0}=n!\cdot\mathbb P[X=n] $$ $$(\forall x\in\mathbb Z)\qquad \int_0^{2\pi}\mathbb E[\mathrm e^{\mathrm itX}]\,\mathrm e^{-\mathrm itx}\,\mathrm dt=2\pi\cdot\mathbb P[X=x] $$
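Both displayed identities can be checked on a toy law (an assumed example of my own, not from the answer): take $P[X=0]=\tfrac15$, $P[X=1]=\tfrac12$, $P[X=2]=\tfrac{3}{10}$.

```python
import numpy as np
import sympy as sp

# Assumed toy law on Z_+ (not from the answer): p_0 = 1/5, p_1 = 1/2, p_2 = 3/10
p = {0: sp.Rational(1, 5), 1: sp.Rational(1, 2), 2: sp.Rational(3, 10)}

# First identity: the n-th derivative of G(s) = E[s^X] at s = 0 is n! * P[X = n]
s = sp.symbols('s')
G = sum(pk * s**k for k, pk in p.items())
assert all(sp.diff(G, s, n).subs(s, 0) == sp.factorial(n) * p[n] for n in p)

# Second identity: (1/2*pi) * integral over [0, 2*pi] of phi(t) e^{-itx} dt
# equals P[X = x] for integer x. An equispaced average over one full period
# integrates trigonometric polynomials exactly.
N = 256
ts = 2 * np.pi * np.arange(N) / N
phi = sum(float(pk) * np.exp(1j * ts * k) for k, pk in p.items())
for x in range(4):
    est = np.mean(phi * np.exp(-1j * ts * x)).real
    print(x, round(est, 10))  # recovers 0.2, 0.5, 0.3, then 0.0
```

The second loop recovers each point mass from $\varphi_X$ alone, which is the uniqueness statement in miniature for integer-valued laws.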


In the case where $X$ has density function $f(x)$, $$ M_X(it) = E(e^{itX}) = \int_{-\infty}^\infty e^{itx}f(x)\,dx, $$ which is the Fourier transform of $f$. Therefore $f$ can be recovered from $M_X(it)$ using the Fourier inversion formula.
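A numerical sketch of this inversion (my own illustration, with the standard normal as the assumed example): for $X\sim N(0,1)$ one has $M_X(it)=e^{-t^2/2}$, and $\frac{1}{2\pi}\int_{-\infty}^{\infty}e^{-itx}M_X(it)\,\mathrm dt$ recovers the density $e^{-x^2/2}/\sqrt{2\pi}$.

```python
import numpy as np

# Assumed example: X ~ N(0,1), whose characteristic function is
# M_X(it) = exp(-t^2/2). Fourier inversion recovers the density:
#   density(x) = (1/2*pi) * integral of e^{-itx} M_X(it) dt over the real line
t = np.arange(-40.0, 40.0, 1e-3)  # the integrand is negligible beyond |t| = 40
dt = 1e-3

def density_from_cf(x):
    integrand = np.exp(-1j * t * x) * np.exp(-t**2 / 2)
    return (integrand.sum() * dt).real / (2 * np.pi)

for x in (0.0, 1.0, 2.0):
    exact = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
    print(x, density_from_cf(x), exact)  # the two columns agree
```

The truncation of the integral to $|t|\le 40$ is harmless here because $e^{-t^2/2}$ decays extremely fast; for heavier-tailed characteristic functions the truncation point would matter.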

The function $M_X(it)$ is exactly the characteristic function $\varphi_X(t)$ of $X$. See Chapter 6 of Kai Lai Chung's book A Course in Probability Theory for more details.