Two equivalent definitions of almost sure convergence of random variables.

The former implies

$$
\begin{eqnarray}
0 &=& P(\lim_{i\to\infty}X_i \neq X\ \text{or}\ \lim_{i\to\infty}X_i\ \text{does not exist}) \\
&=& P(\omega:\exists n\in\mathbf{N},\ \forall m\in\mathbf{N},\ \exists i>m\ \text{s.t.}\ |X_i(\omega) - X(\omega)| \ge 1/n)\\
&=& P\Big(\bigcup_n \bigcap_m \bigcup_{i>m} \{\omega:|X_i(\omega) - X(\omega)| \ge 1/n\}\Big)\\
&\overset{\forall n}{\ge}& P\Big(\bigcap_m \bigcup_{i>m} \{\omega:|X_i(\omega) - X(\omega)| \ge 1/n\}\Big)\\
&=& P\Big(\limsup_{i\to\infty}\, \{\omega:|X_i(\omega) - X(\omega)| \ge 1/n\}\Big) \\
&=& P\Big(\lim_{i\to\infty} \bigcup_{j > i}\{\omega:|X_j(\omega) - X(\omega)| \ge 1/n\}\Big)\\
&=& \lim_{i\to\infty} P\Big(\bigcup_{j > i}\{\omega:|X_j(\omega) - X(\omega)| \ge 1/n\}\Big) \ge 0.
\end{eqnarray}
$$

The last equality is justified by continuity of measures from above (see here), since the sets $\bigcup_{j>i}\{\cdot\}$ decrease in $i$. Because $\bigcup_{j>i}\{\omega:|X_j(\omega)-X(\omega)|\ge 1/n\} = \{\omega:\sup_{j>i}|X_j(\omega)-X(\omega)|\ge 1/n\}$, this shows $\lim_{i\to\infty} P(\sup_{j>i}|X_j - X| \ge 1/n) = 0$ for every $n$; for an arbitrary $\epsilon > 0$, pick $n$ with $1/n \le \epsilon$ and use monotonicity.

Now let's prove the converse. Fix $n\in\mathbf{N}$. Then

$$
\begin{eqnarray}
0 &=& \lim_{i\to\infty} P\Big(\bigcup_{j > i}\{\omega:|X_j(\omega) - X(\omega)| \ge 1/n\}\Big)\\
&=& P\Big(\lim_{i\to\infty} \bigcup_{j > i}\{\omega:|X_j(\omega) - X(\omega)| \ge 1/n\}\Big)\\
&=& P\Big(\limsup_{i\to\infty}\, \{\omega:|X_i(\omega) - X(\omega)| \ge 1/n\}\Big)\\
&=& P\Big(\bigcap_i \bigcup_{j>i} \{\omega:|X_j(\omega) - X(\omega)| \ge 1/n\}\Big)\\
&=& P(\omega:\forall i,\ \exists j>i\ \text{s.t.}\ |X_j(\omega) - X(\omega)| \ge 1/n),
\end{eqnarray}
$$

again by continuity of measures from above. Taking complements, for each $n$,

$$
P(A_n) = 1, \qquad \text{where } A_n := \{\omega: \exists i,\ \forall j>i,\ |X_j(\omega) - X(\omega)| < 1/n\}.
$$

A countable intersection of events of probability one still has probability one, so

$$
\begin{eqnarray}
1 &=& P\Big(\bigcap_n A_n\Big)\\
&=& P(\omega:\lim_{j\to\infty}|X_j(\omega) - X(\omega)| = 0)\\
&=& P(\omega:\lim_{j\to\infty}X_j(\omega) = X(\omega)).
\end{eqnarray}
$$

I think this proof could still be polished, but I hope it is not wrong in a bird's-eye view at least.
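As a sanity check (not part of the proof), here is a small Monte Carlo sketch: take $X_j$ to be the sample mean of $j$ iid Uniform(0,1) draws and $X = 1/2$, so $X_j \to X$ a.s. by the strong law of large numbers. The estimated tail probabilities $P(\sup_{j>i}|X_j - X| \ge \epsilon)$ should then shrink as $i$ grows, matching the second definition. All function names and parameters below are my own choices for the illustration.

```python
import random

random.seed(0)

def tail_sup_exceeds(i, horizon, eps):
    """One sample path: does sup_{i < j <= horizon} |X_j - 1/2| >= eps,
    where X_j is the running mean of j Uniform(0,1) draws?"""
    s = 0.0
    for j in range(1, horizon + 1):
        s += random.random()
        if j > i and abs(s / j - 0.5) >= eps:
            return True
    return False

def estimate(i, horizon=2000, eps=0.05, paths=400):
    """Monte Carlo estimate of P(sup_{j > i} |X_j - 1/2| >= eps),
    with the sup truncated at `horizon`."""
    return sum(tail_sup_exceeds(i, horizon, eps) for _ in range(paths)) / paths

probs = [estimate(i) for i in (10, 100, 1000)]
print(probs)  # roughly decreasing toward 0 as i grows
```

The truncation at `horizon` only underestimates the true tail probability, so the decreasing trend is still informative.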


Well, let's denote by $Y_n$ the variable $\sup_{k>n} |X_k - X|$. Observe that

  1. $(Y_n)$ is a decreasing sequence of nonnegative random variables
  2. $Y_n \to 0$ iff $X_n \to X$
  3. "$\mathsf{P}\{Y_n > \epsilon\} \to 0$ for all $\epsilon$" means that $Y_n \to 0$ in probability

Now everything follows from the simple but useful fact: if $(Y_n)$ is a monotone sequence, then its convergence in probability implies almost sure convergence (that the converse holds for any sequence is an extremely standard fact). This can be proved by different means, but the easiest proof that I know of is as follows:

Any monotone sequence converges (almost surely) to something, possibly infinite (this is just standard calculus, and "almost surely" is actually irrelevant here). Hence it also converges in probability to that same limit; so if $Y_n \to Y$ in probability, then $Y$ must coincide with the almost sure limit, since limits in probability are unique up to a null set.
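In symbols, the chain of implications for our $Y_n$ reads:

$$
Y_n \downarrow Y^* \ \text{pointwise}
\;\Longrightarrow\; Y_n \to Y^* \ \text{a.s.}
\;\Longrightarrow\; Y_n \to Y^* \ \text{in probability},
$$

and if also $Y_n \to 0$ in probability, then uniqueness of limits in probability forces $Y^* = 0$ a.s., i.e. $Y_n \to 0$ almost surely, which by observation 2 is exactly $X_n \to X$ almost surely.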