Deducing a probability distribution from its moment-generating function

The question is to invert a Laplace/Fourier transform. I take the example of a discrete distribution $f(n)$ on the natural numbers with moment-generating function $$ M(t) = \sum_{n=0}^\infty f(n)e^{nt}, $$ where the power series $\sum_n f(n)s^n$ has radius of convergence $R \geq 1$. Fourier inversion here is $$ f(n) = \frac{1}{2\pi}\int_0^{2\pi}M(i\theta)e^{-in\theta}\,d\theta $$
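As a quick numerical sanity check of this inversion formula, here is a minimal sketch in Python; the Poisson m.g.f. $M(t) = e^{\lambda(e^t - 1)}$ is my own choice of test case, not part of the answer:

```python
import numpy as np
from math import factorial

# Numerically invert M(i*theta) via the formula above, with the
# Poisson(lam) m.g.f. as a test case; f(n) should be lam^n e^(-lam)/n!.
lam = 2.0
def M(t):
    return np.exp(lam * (np.exp(t) - 1))

K = 1024                          # uniform grid on [0, 2*pi)
theta = 2 * np.pi * np.arange(K) / K
for n in range(5):
    # (1/2pi) * integral of M(i*theta) e^{-i*n*theta} d(theta),
    # as a Riemann sum over the periodic grid == a plain mean.
    f_n = (M(1j * theta) * np.exp(-1j * n * theta)).mean().real
    print(n, f_n, lam**n * np.exp(-lam) / factorial(n))
```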

If you prefer to stay in the real realm, there is an interesting formula due to Post: if $F(s) = \int_0^\infty e^{-st}f(t)\,dt$ is the Laplace transform of $f$, then $$ f(t) = \lim_{k\to\infty}\frac{(-1)^k}{k!}\left(\frac{k}{t}\right)^{k+1}F^{(k)}\!\left(\frac{k}{t}\right). $$

A related formula would be: $$ f(n) = \frac{1}{n!}\left.\frac{d^n}{dt^n}M(\log t)\right|_{t=0} $$ (note that $M(\log t) = \sum_n f(n)t^n$ is the probability-generating function, so its $n$-th Taylor coefficient at $t=0$ is exactly $f(n)$).


I know that this question is old, but all the answers provided thus far only deal with the continuous case. In the edit, @user54609 states that they were asking how to get the probability distribution of a discrete random variable, and for that case the answer is surprisingly simple.

The only difference is that it is easier to do what you want using the probability-generating function $G(s)$ instead of the moment-generating function $E[e^{tX}]$. It is easy to convert between them by looking at the definition of $G(s)$:

$$ G(s) = \sum_{n=0}^{+\infty}p(n)s^{n}, $$ where $n$ is the discrete random variable and $p(n)$ is its probability mass function. Thus, you can convert between $G$ and the m.g.f. by setting $s=e^{t}$.

Now, let's suppose that we know $G(s)$ but not $p(n)$, and we want to find the functional form of $p(n)$ from what we know about $G(s)$. The way to do that is by evaluating derivatives of $G(s)$ at $s=0$:

$$ p(n)=\frac{1}{n!}\left.\frac{d^{n} G(s)}{d s^{n}}\right|_{s=0} $$

For example, the probability-generating function of a Poisson-distributed random variable is $G(s)=e^{\lambda(s-1)}$. We can recover the probability distribution of the Poisson random variable $n$ as

$$ \frac{1}{n!}\left.\frac{d^{n} G(s)}{d s^{n}}\right|_{s=0}=\left.\frac{\lambda^{n}}{n!}e^{(s-1)\lambda}\right|_{s=0} = \frac{\lambda^{n}}{n!}e^{-\lambda} = p(n) $$

You can try this with other example distributions, such as the binomial, whose probability-generating function is $G(s)=\left[1+(s-1)p\right]^{N}$, to gain some trust in the method, but it works for any discrete random variable supported on the nonnegative integers.
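If you want to check this mechanically, here is a small sketch using sympy to evaluate the derivatives symbolically for the two generating functions above (the script and its names are my own, added for illustration):

```python
import sympy as sp

# Check p(n) = G^{(n)}(0) / n! for the Poisson and binomial p.g.f.s above.
s, lam, p, N = sp.symbols('s lam p N', positive=True)

examples = [('Poisson',  sp.exp(lam * (s - 1))),
            ('binomial', (1 + (s - 1) * p) ** N)]

for name, G in examples:
    for n in range(3):
        p_n = sp.diff(G, s, n).subs(s, 0) / sp.factorial(n)
        print(name, n, sp.simplify(p_n))
```

For the Poisson case this prints $e^{-\lambda}$, $\lambda e^{-\lambda}$, $\tfrac{\lambda^2}{2}e^{-\lambda}$, and so on, matching $p(n) = \frac{\lambda^n}{n!}e^{-\lambda}$; for the binomial it reproduces the binomial probabilities $\binom{N}{n}p^n(1-p)^{N-n}$.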


Let $\mathcal{M}(g)(s) = \int_0^{\infty} x^{s-1} g(x)\,dx$ be the Mellin transform; then the moment-generating function of a smooth enough p.d.f. $f$ is given by $\mathcal{M}(f(-\log x))(-s)$;

so given a nice enough moment-generating function $h(s) = E[e^{sX}] = \int_{-\infty}^{\infty} e^{sx} f(x)\,dx$, we recover $f$ as $$f(x) = \mathcal{M}^{-1}(h(-s))(e^{-x}),$$ where $\mathcal{M}^{-1}$ is given by the Mellin inversion theorem: $$\mathcal{M}^{-1}h(x) = \frac{1}{2\pi i}\int_{c - i\infty}^{c + i \infty} x^{-s} h(s)\,ds$$ for an appropriate real number $c$, where the integral is understood to be along a vertical line in $\mathbb{C}$.
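As a quick sanity check of these conventions (with an exponential test case of my own choosing): for $f(x) = e^{-x}\mathbf{1}_{x\ge 0}$ we have $h(s) = \frac{1}{1-s}$ for $\operatorname{Re}(s) < 1$, while $f(-\log x) = x\,\mathbf{1}_{0<x\le 1}$ has Mellin transform $$\int_0^1 x^{s-1}\cdot x\,dx = \frac{1}{s+1} = h(-s),$$ in agreement with the identity above.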


Yes, you can. First, convert your mgf into a characteristic function (i.e. replace $t \rightarrow it$). Next, invert the characteristic function to yield the pdf using an inverse Fourier transform.
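A minimal numerical sketch of this recipe, with the standard normal as my own choice of test case:

```python
import numpy as np
from scipy.integrate import quad

# The mgf of the standard normal is M(t) = exp(t^2/2); replacing t -> i*t
# gives the characteristic function phi(t) = exp(-t^2/2).
def phi(t):
    return np.exp(-t**2 / 2)

# Inverse Fourier transform: f(x) = (1/2pi) * int exp(-i*t*x) phi(t) dt.
def pdf(x):
    val, _ = quad(lambda t: (np.exp(-1j * t * x) * phi(t)).real,
                  -np.inf, np.inf)
    return val / (2 * np.pi)

for x in (0.0, 1.0, 2.0):
    print(x, pdf(x), np.exp(-x**2 / 2) / np.sqrt(2 * np.pi))
```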