Continuous Mapping Theorem for Random Variables

Solution 1:

BeerR's proof

The proof is correct, although justifying that the first term converges to zero is not completely trivial.

Fix $\varepsilon > 0$ and pick a sequence $\delta_m \searrow 0$. Define

$$ A_m := \{\omega: |g(X_n(\omega))-g(X(\omega))| > \varepsilon, |X_n(\omega)-X(\omega)| <\delta_m \text{ for some } n\} $$

Clearly $A_{m'} \subset A_m$ for $m' > m$, since $\delta_{m'} \leq \delta_m$.

We will show that $A_m \searrow \emptyset$ as $m \to \infty$. Then by continuity of probability measures we will get $\Pr(A_m) \searrow 0$ as $m \to \infty$.

To see this, fix $\omega$:

By continuity of $g$ at $X(\omega)$, there exists a $\delta$ such that $|g(y) - g(X(\omega))| < \varepsilon$ for all $y$ with $|y-X(\omega)| < \delta$.

In particular, pick $M$ such that $\delta_M < \delta$. Then for every $n$ with $|X_n(\omega) - X(\omega)| < \delta_M$ we also have $|g(X_n(\omega)) - g(X(\omega))| < \varepsilon$, so $\omega \notin A_M$. Since $\omega$ was arbitrary, it follows that:

$$ \bigcap_{m=1}^{\infty} A_m = \emptyset $$

This shows that the first term is negligible, i.e. for all $n$:

$$ \Pr(\{\omega: |g(X_n(\omega))-g(X(\omega))| > \varepsilon, |X_n(\omega)-X(\omega)| <\delta_m \}) \leq \Pr(A_m)$$

Now, in BeerR's expression, one can first let $n \to \infty$ and then $m \to \infty$ to obtain the desired result.
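As a quick numerical sanity check (not part of the proof), one can simulate the conclusion: take $X \sim N(0,1)$, $X_n = X + N(0, 1/n)$ noise (so $X_n \to X$ in probability), and a continuous but *not* uniformly continuous $g(x) = x^2$, then watch the estimated probability shrink. All names here ($N$, the noise model) are illustrative choices, not from the original argument:

```python
import random

random.seed(0)
eps = 0.1
N = 100_000  # number of sample paths

# A fixed sample of X ~ N(0, 1)
X = [random.gauss(0, 1) for _ in range(N)]

def g(x):
    return x * x  # continuous on R, but not uniformly continuous

for n in [1, 10, 100, 1000]:
    sigma = 1 / n ** 0.5  # X_n = X + N(0, 1/n), so X_n -> X in probability
    count = sum(1 for x in X
                if abs(g(x + random.gauss(0, sigma)) - g(x)) > eps)
    print(f"n={n:5d}  P(|g(X_n)-g(X)| > eps) ~ {count / N:.4f}")
```

The estimated probability decreases toward $0$ as $n$ grows, consistent with $g(X_n) \to g(X)$ in probability.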

Did's comment

If $g$ is uniformly continuous, then things simplify quite a bit:

For fixed $\varepsilon > 0$ choose $\delta >0$ such that $|g(x)-g(y)| < \varepsilon$ for all $x,y$ with $|x-y| < \delta$.

Then $|g(X_n(\omega))-g(X(\omega))| \geq \varepsilon$ implies $|X_n(\omega) - X(\omega)| \geq \delta$ (the contrapositive of the choice of $\delta$).

Therefore:

$$ \Pr(|g(X_n)-g(X)| \geq \varepsilon) \leq \Pr( |X_n - X| \geq \delta) $$

The right-hand side tends to $0$ as $n \to \infty$, since $X_n \to X$ in probability and $\delta > 0$ is a fixed positive number.
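The domination inequality above can also be checked numerically. A minimal sketch, assuming $g = \sin$ (which is $1$-Lipschitz, hence uniformly continuous, so $\delta = \varepsilon$ works) and an arbitrary illustrative noise model for $X_n$; the inequality holds pathwise, not just in probability:

```python
import math
import random

random.seed(0)
N = 100_000
eps = delta = 0.05  # sin is 1-Lipschitz: |x - y| < delta implies |sin x - sin y| < eps

lhs = rhs = 0
for _ in range(N):
    x = random.gauss(0, 1)         # a draw of X
    xn = x + random.gauss(0, 0.1)  # a draw of X_n, concentrated near X
    lhs += abs(math.sin(xn) - math.sin(x)) >= eps
    rhs += abs(xn - x) >= delta

print(f"P(|g(X_n)-g(X)| >= eps) ~ {lhs / N:.4f}")
print(f"P(|X_n - X| >= delta)  ~ {rhs / N:.4f}")
```

Since $|\sin a - \sin b| \leq |a - b|$, every sample path counted on the left is also counted on the right, so the estimated LHS never exceeds the estimated RHS.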