How to prove that convergence in MGF implies Convergence in Distribution?

Solution 1:

I'm not entirely sure what you mean by "two distributions" converging, so here is what I'll assume: let $\mu_n$ and $\mu$ be probability measures on the half-line $[0,\infty)$. Let $L_n$ and $L$ be their Laplace transforms, $$L_n(t) = \int_{x \geq 0} e^{-tx}\, d\mu_n(x), \qquad L(t) = \int_{x \geq 0} e^{-tx}\, d\mu(x)$$ (for measures on $[0,\infty)$ this is just the moment generating function evaluated at $-t$). We want to know whether $L_n(t) \rightarrow L(t)$ for every $t \geq 0$ implies $\mu_n \Rightarrow \mu$, where $\Rightarrow$ denotes weak convergence. (Equivalently, for measures on the line, $\Rightarrow$ means $F_n(x) \rightarrow F(x)$ at every continuity point $x$ of $F$, where $F_n$ and $F$ are the corresponding cdfs.) I'll sketch a proof of this statement, which I learned from Billingsley's Convergence of Probability Measures, Example 5.5.
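
To fix ideas, here is a quick sanity check of the statement (an illustration only, not part of the proof): take $\mu_n = \mathrm{Exp}(\lambda_n)$ with rates $\lambda_n \rightarrow \lambda > 0$ and $\mu = \mathrm{Exp}(\lambda)$. Then $$L_n(t) = \frac{\lambda_n}{\lambda_n + t} \longrightarrow \frac{\lambda}{\lambda + t} = L(t) \quad \text{for every } t \geq 0,$$ and indeed $F_n(x) = 1 - e^{-\lambda_n x} \rightarrow 1 - e^{-\lambda x} = F(x)$ for every $x \geq 0$, so $\mu_n \Rightarrow \mu$, as the statement predicts.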

The idea of the proof is an application of Prohorov's theorem. A family of probability measures $\mathcal{F}$ is said to be tight if for every $\epsilon > 0$ there is a compact set $K \subset \mathbb{R}$ such that $$\mu(K) > 1-\epsilon \quad \text{for all } \mu \in \mathcal{F}.$$
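
For intuition (this example is not used below): the family of point masses $\{\delta_n : n \geq 1\}$, where $\delta_n$ puts mass $1$ at the integer $n$, is not tight, since for any compact $K$ we have $\delta_n(K) = 0$ as soon as $n \notin K$. By contrast, any family of measures supported on a fixed bounded interval $[0,M]$ is trivially tight: take $K = [0,M]$.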

Prohorov's theorem says that if the family $\mathcal{F}$ is tight, then it is relatively compact. That is, for any sequence $\{\mu_n\}_{n \geq 1} \subset \mathcal{F}$, there is a subsequence $\mu_{n_k}$ and a probability measure $\nu$ (not necessarily in $\mathcal{F}$ itself) such that $\mu_{n_k} \Rightarrow \nu$.
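
The point-mass example above also shows why tightness cannot be dropped: if some subsequence $\delta_{n_k}$ converged weakly to a probability measure $\nu$, we could pick a continuous, compactly supported $f$ with $0 \leq f \leq 1$ and $\int f\, d\nu > 1/2$, yet $\int f\, d\delta_{n_k} = f(n_k) = 0$ for all large $k$, a contradiction. The mass escapes to infinity, and no weak limit among probability measures exists.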

The Laplace transform of a random variable $X$ with distribution $\mu$ on $\mathbb{R}_{\geq 0}$ is given by $$L(t) = \int_0^\infty e^{-tx}\,d\mu(x).$$ Suppose that $L_n(t) \rightarrow L(t)$ pointwise for all $t \geq 0$. Note that \begin{align*} \frac{1}{u}\int_0^u (1 - L(t))\, dt &= \frac{1}{u} \int_{x \geq 0}\int_0^u (1 - e^{-tx})\, dt\, d\mu(x) \tag{$\int d\mu = 1$, Fubini}\\ &\geq \frac{1}{u} \int_{x \geq 1/u} \int_0^u (1 - e^{-tx})\, dt\, d\mu(x) \\ &\geq \frac{1}{u} \int_{x \geq 1/u} \int_0^u (1 - e^{-t/u})\, dt\, d\mu(x) \tag{monotonicity} \\ &= \int_{x \geq 1/u} e^{-1}\, d\mu(x) = e^{-1}\mu([1/u, \infty)), \end{align*} where the last equality uses $\frac{1}{u}\int_0^u (1 - e^{-t/u})\,dt = 1 - (1 - e^{-1}) = e^{-1}$. Writing $$B_u(L) = \frac{1}{u}\int_0^u (1-L(t))\, dt,$$ this says $\mu([1/u,\infty)) \leq e\, B_u(L)$, and the same bound holds with $\mu_n$ and $L_n$ in place of $\mu$ and $L$.

Now fix $\epsilon > 0$. Since $L(0) = 1$ and $L$ is continuous at $t = 0$, we can choose $u_0 > 0$ so small that $B_{u_0}(L) < \epsilon/(2e)$. Moreover, since $L_n(t) \rightarrow L(t)$ pointwise and $0 \leq L_n \leq 1$, the bounded convergence theorem gives $B_{u_0}(L_n) \rightarrow B_{u_0}(L)$ (one can also argue via Dini's second theorem, since each $L_n$ is nonincreasing and $L$ is continuous). Hence there is an $N$ with $B_{u_0}(L_n) < \epsilon/e$, i.e. $\mu_n([1/u_0,\infty)) < \epsilon$, for all $n > N$. For each of the finitely many $n \leq N$, continuity of $L_n$ at $0$ likewise gives a $u_n > 0$ with $\mu_n([1/u_n,\infty)) < \epsilon$. Let $u = \min(u_0, u_1, \dots, u_N)$ and take the compact set $K = [0, 1/u]$; since $[1/u,\infty) \subset [1/u_n,\infty)$ for each $n$, we get $\mu_n(K) > 1-\epsilon$ for every $n$. This proves the family of measures is tight, and so by Prohorov's theorem there is a subsequence $F_{n_i}(x)$ which converges to some cdf $G(x)$ at all continuity points of $G$. It remains to show that $G$ is in fact $F$.
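
To make the bound concrete (again only an illustration, using the exponential example from above): for $\mu = \mathrm{Exp}(\lambda)$ we have $1 - L(t) = \frac{t}{\lambda + t}$, so $$B_u(L) = \frac{1}{u}\int_0^u \frac{t}{\lambda + t}\, dt = 1 - \frac{\lambda}{u}\log\left(1 + \frac{u}{\lambda}\right) \longrightarrow 0 \quad \text{as } u \to 0^+,$$ while $\mu([1/u,\infty)) = e^{-\lambda/u}$, which is indeed at most $e\, B_u(L)$ and also tends to $0$, exactly as the general inequality requires.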

Since $\mu_{n_i}$ converges weakly to the measure with cdf $G$, and $x \mapsto e^{-tx}$ is bounded and continuous on $[0,\infty)$ for each fixed $t \geq 0$, we get $L_{n_i}(t) \rightarrow L_G(t)$, where $L_G$ is the Laplace transform of $G$. But $L_n(t) \rightarrow L(t)$ by assumption, so $L_G = L$, and therefore $G = F$ by uniqueness of Laplace transforms. Finally, the same argument applies to every subsequence of $(\mu_n)$: it is tight, hence has a further subsequence converging weakly, and by the above that limit must be $\mu$. A sequence every subsequence of which has a further subsequence converging weakly to $\mu$ must itself converge weakly to $\mu$, so $\mu_n \Rightarrow \mu$, as desired.