When are probability distributions completely determined by their moments?

Roughly speaking, if the sequence of moments doesn't grow too quickly, then the distribution is determined by its moments. One sufficient condition: if the moment generating function of a random variable has a positive radius of convergence, then that random variable's distribution is determined by its moments. See Billingsley, Probability and Measure, Chapter 30.
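
For instance, the exponential distribution with rate 1 has $n$th moment $m_n = n!$ and moment generating function
$$M(t) = E[e^{tX}] = \frac{1}{1-t}, \qquad t < 1,$$
so the series $\sum_n m_n t^n/n! = \sum_n t^n$ has radius of convergence $1 > 0$, and the exponential distribution is therefore determined by its moments.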

A standard example of two distinct distributions with the same moments is based on the lognormal distribution:

$$f_0(x) = \frac{1}{\sqrt{2\pi}}\, x^{-1} \exp\!\left(-(\log x)^2/2\right), \qquad x > 0,$$

which is the density of the lognormal, and the perturbed versions

$$f_a(x) = f_0(x)\left(1 + a \sin(2\pi \log x)\right), \qquad -1 \le a \le 1,$$

where the restriction on $a$ keeps $f_a$ nonnegative. These all have the same moments; namely, the $n$th moment of each is $e^{n^2/2}$. (Substituting $y = \log x$, the perturbation contributes $\int e^{ny}\varphi(y)\sin(2\pi y)\,dy$, where $\varphi$ is the standard normal density, and this integral vanishes for every integer $n$.)
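
This can be checked numerically; here is a quick sketch (assuming numpy and scipy are available) that computes the $n$th moments of $f_0$ and of a perturbed $f_a$ after the substitution $y = \log x$:

```python
import numpy as np
from scipy.integrate import quad

def phi(y):
    # standard normal density
    return np.exp(-y**2 / 2) / np.sqrt(2 * np.pi)

def moment(n, a):
    # after y = log x, the nth moment of f_a is the integral of
    # exp(n*y) * phi(y) * (1 + a*sin(2*pi*y)) over the real line
    integrand = lambda y: np.exp(n * y) * phi(y) * (1 + a * np.sin(2 * np.pi * y))
    val, _ = quad(integrand, -np.inf, np.inf)
    return val

for n in range(5):
    # the moments of f_0 and f_{3/4} both agree with exp(n^2/2)
    print(n, moment(n, 0.0), moment(n, 0.75), np.exp(n**2 / 2))
```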

A condition for a distribution over the reals to be determined by its moments is that $\limsup_{k \to \infty} \mu_{2k}^{1/(2k)}/2k$ is finite, where $\mu_{2k}$ is the $(2k)$th moment of the distribution. For a distribution supported on the positive reals, it suffices that $\limsup_{k \to \infty} \mu_k^{1/(2k)}/2k$ is finite.
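
As an illustration (a numerical sketch, not part of the original answer): for the standard normal, $\mu_{2k} = (2k-1)!!$ and the quantity above tends to $0$, while for the lognormal, $\mu_{2k} = e^{2k^2}$ and it diverges, consistent with the lognormal's failure to be determined by its moments.

```python
import numpy as np
from scipy.special import factorial2

# Criterion: limsup_k mu_{2k}^(1/(2k)) / (2k) should be finite.
for k in range(1, 11):
    mu_normal = factorial2(2 * k - 1)  # (2k)th moment of N(0,1) is (2k-1)!!
    mu_lognorm = np.exp(2 * k**2)      # (2k)th moment of the lognormal is e^((2k)^2/2)
    print(k,
          mu_normal ** (1 / (2 * k)) / (2 * k),   # tends to 0: moment-determinate
          mu_lognorm ** (1 / (2 * k)) / (2 * k))  # grows without bound: criterion fails
```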

This example is from Rick Durrett, Probability: Theory and Examples, 3rd edition, pp. 106-107; as the original source for the lognormal example, Durrett cites C. C. Heyde (1963), "On a property of the lognormal distribution", J. Roy. Statist. Soc. B 25, 392-393.


As has been mentioned in previous answers, the moments do not uniquely determine the distribution unless certain conditions are satisfied, such as the distribution being bounded. One thing you can say is that the distribution of a random variable $X$ is always uniquely determined by its characteristic function $\phi_X(a)=E[\exp(iaX)]$. Letting $m_n=E[X^n]$ be the $n$th moment, this can be expanded as

$$\phi_X(a)=\sum_n \frac{i^na^nm_n}{n!},$$

which is valid within its radius of convergence. So, the moments will uniquely determine the distribution as long as this series has infinite radius of convergence, which is the case as long as $$\limsup_{n\rightarrow\infty}\left|\frac{m_n}{n!}\right|^{\frac{1}{n}}=0.$$ Stirling's formula simplifies this a bit to $\limsup_{n\rightarrow\infty}|m_n|^{1/n}/n=0$. This can be proven using the dominated convergence theorem.

For example, if a distribution is bounded by $K$ then $|m_n|\le K^n$, which satisfies this condition.
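
As a concrete check (a numerical sketch, not part of the original answer): $X \sim \mathrm{Uniform}(0,1)$ is bounded by $K = 1$ and has $m_n = 1/(n+1)$, and truncating the series above recovers its known characteristic function $(e^{ia}-1)/(ia)$:

```python
import math
import numpy as np

# Uniform(0,1): m_n = 1/(n+1), so the moment series for phi_X(a) is
# sum_n (i*a)^n / ((n+1) * n!), which should match (e^{ia} - 1)/(ia).
a = 3.0
series = sum((1j * a) ** n / ((n + 1) * math.factorial(n)) for n in range(40))
exact = (np.exp(1j * a) - 1) / (1j * a)
print(abs(series - exact))  # ~1e-16: the truncated series matches the exact value
```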

On the other hand, it is possible to construct distinct distributions supported on the positive integers with the same moments. To do this, you need to find a sequence of real numbers $c_n$ satisfying $\sum_n c_n n^r=0$ for all $r\ge 0$ (with the sums converging absolutely). Solving this for any finite set of powers $r$ doesn't involve anything more than solving some linear equations. Then, by adding more terms to extend to all $r\ge 0$, you get the infinite sequence $c_n$. The two distributions can then be obtained by taking the positive and negative parts of $c_n$.
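
Here is a sketch of the finite step (assuming numpy and scipy; this toy version only matches moments up to order 4, whereas the full construction extends to all orders):

```python
import numpy as np
from scipy.linalg import null_space

# Find c_1, ..., c_6 with sum_n c_n * n^r = 0 for r = 0, ..., 4:
# a null vector of the 5x6 matrix whose rows are n^0, n^1, ..., n^4.
n = np.arange(1, 7)
V = np.vander(n, 5, increasing=True).T
c = null_space(V)[:, 0]

# Split into positive and negative parts; both have equal total mass
# (that is the r = 0 equation), so normalizing gives two probability vectors.
p = np.maximum(c, 0) / np.maximum(c, 0).sum()
q = np.maximum(-c, 0) / np.maximum(-c, 0).sum()

for r in range(5):
    print(r, p @ n**r, q @ n**r)  # the two distributions share moments for r <= 4
```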


This sounds like one of the classical "moment problems" that have been much studied, although I'm afraid I don't know the literature. Wikipedia suggests that the term to look for is the Hamburger moment problem.

A quick Google also throws up an article by Stoyanov which ought to have some examples of non-uniqueness and pointers to the literature.

As you might know, if we know in advance that the density is confined to some bounded interval (say $[-1,1]$ for the sake of argument), then the moments do indeed determine the density. (This basically follows because the density is determined by its values when integrated against continuous functions, and continuous functions on a closed bounded interval can be approximated to arbitrary accuracy by polynomials, by the Weierstrass approximation theorem.)
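
To illustrate (a numerical sketch, not part of the original answer): for $X \sim \mathrm{Uniform}(-1,1)$, the expectation of any continuous $g$ can be recovered from the moments alone by first approximating $g$ with a polynomial. Taking $g = \cos$, the exact answer is $\sin(1)$:

```python
import numpy as np

# Moments of Uniform(-1,1): m_k = 0 for odd k, 1/(k+1) for even k.
m = np.array([0 if k % 2 else 1 / (k + 1) for k in range(16)])

# Weierstrass step: approximate cos by a degree-15 polynomial on [-1,1],
# then pair its power-basis coefficients with the moments to get E[cos X].
x = np.linspace(-1, 1, 200)
coeffs = np.polynomial.polynomial.polyfit(x, np.cos(x), 15)
print(coeffs @ m, np.sin(1))  # the two values agree to high precision
```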