Collection of standard facts about convergence of random variables
$L^p$-convergence implies convergence in probability.
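This follows in one line from Markov's inequality applied to $|X_n-X|^p$; a sketch:

```latex
% For any \varepsilon > 0, Markov's inequality gives
\mathbb{P}(|X_n - X| > \varepsilon)
  \;\le\; \frac{\mathbb{E}|X_n - X|^p}{\varepsilon^p}
  \;\xrightarrow[n \to \infty]{}\; 0 .
```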
Partial converse: If $X_n \to X$ in probability and $(X_n)_{n \in \mathbb{N}}$ is $L^p$-bounded (i.e. $\sup_{n} \mathbb{E}|X_n|^p < \infty$), then $X_n \to X$ in $L^q$ for every $q<p$.
If $X_n \to X$ almost surely and $\mathbb{E}(|X_n|)\to \mathbb{E}(|X|)<\infty$, then $X_n \to X$ in $L^1$.
Counterexamples for implications which fail to hold in general (convergence in probability does not imply almost sure convergence, convergence in distribution does not imply convergence in probability, and so on)
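One such counterexample can be made concrete: the "typewriter sequence" of indicators of shrinking dyadic intervals on $[0,1)$ with Lebesgue measure converges to $0$ in $L^1$ and in probability, but converges nowhere pointwise. A minimal sketch (the indexing convention $n = 2^k + j$ is one common choice):

```python
def typewriter_interval(n):
    """Dyadic interval A_n = [j/2**k, (j+1)/2**k), where n = 2**k + j, n >= 1."""
    k = n.bit_length() - 1      # dyadic level of index n
    j = n - 2 ** k              # position within level k
    return j / 2 ** k, (j + 1) / 2 ** k

def X(n, x):
    """Typewriter random variable X_n = 1_{A_n}, evaluated at x in [0, 1)."""
    a, b = typewriter_interval(n)
    return 1 if a <= x < b else 0

# E|X_n| equals the length of A_n, namely 2**-k -> 0,
# so X_n -> 0 in L^1 and hence in probability ...
assert all(
    typewriter_interval(n)[1] - typewriter_interval(n)[0] == 2.0 ** -9
    for n in range(2 ** 9, 2 ** 10)      # every index at level k = 9
)

# ... but for any fixed x, X_n(x) = 1 exactly once per dyadic level k
# (the level-k intervals partition [0, 1)), hence infinitely often,
# so the sequence (X_n(x)) converges for no x.
hits = [n for n in range(1, 2 ** 10) if X(n, 0.3) == 1]
assert len(hits) == 10   # one hit for each level k = 0, ..., 9
```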
If $X_n \stackrel{d}{\to} X$ and $Y_n \stackrel{\mathbb{P}}{\to} c$ for a constant $c$, then $X_n Y_n \stackrel{d}{\to} cX$ (Slutsky's theorem I). Note: If $c=0$, we even get convergence in probability, since $X_n Y_n \stackrel{d}{\to} 0$ implies $X_n Y_n \stackrel{\mathbb{P}}{\to} 0$ (convergence in distribution to a constant upgrades to convergence in probability, see fact below).
If $X_n \stackrel{d}{\to} X$ and $Y_n \stackrel{d}{\to} c$ for a constant $c$, then $X_n+Y_n \stackrel{d}{\to} X+c$. (Slutsky's theorem II)
$X_n \stackrel{d}{\to} X$ and $Y_n \stackrel{d}{\to} Y$ do not, in general, imply $X_n+Y_n \stackrel{d}{\to} X+Y$.
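The reason is that the limit of the sum depends on the joint distribution, which convergence in distribution does not control. A standard counterexample: let $X \sim N(0,1)$ and set $X_n := X$ for all $n$; then

```latex
% Two choices of Y_n with the same distributional limit:
Y_n := X  \;\Longrightarrow\; Y_n \xrightarrow{d} X,
        \qquad X_n + Y_n = 2X \sim N(0,4),
\\[4pt]
Y_n := -X \;\Longrightarrow\; Y_n \xrightarrow{d} -X \stackrel{d}{=} X,
        \qquad X_n + Y_n = 0 .
% Identical marginal limits, different limits for the sum.
```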
If $X_n \to X$ and $Y_n \to Y$ in probability, then the product $X_n Y_n$ converges in probability to $X \cdot Y$.
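A proof sketch via the usual decomposition of the product:

```latex
X_n Y_n - X Y \;=\; (X_n - X)(Y_n - Y) \;+\; X\,(Y_n - Y) \;+\; Y\,(X_n - X).
% The first term tends to 0 in probability because both factors do;
% the last two tend to 0 in probability because X and Y are a.s. finite,
% hence tight, while the other factor tends to 0 in probability.
```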
If $X_n \to c \neq 0$ in probability, then $1/X_n \to 1/c$ in probability.
If $X_n \to X$ in distribution and $c_n \to c$, then $c_n X_n \to cX$ in distribution.
If $X_n \to X$ in distribution and $X_n$ is Gaussian for each $n \in \mathbb{N}$, then $X$ is Gaussian.
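A sketch via characteristic functions, with $X_n \sim N(\mu_n, \sigma_n^2)$ (the limit may be a degenerate Gaussian, i.e. a constant):

```latex
\varphi_{X_n}(t) \;=\; \exp\!\Big( i \mu_n t - \tfrac{1}{2} \sigma_n^2 t^2 \Big)
  \;\longrightarrow\; \varphi_X(t) \qquad \text{for all } t \in \mathbb{R}.
% Convergence of |\varphi_{X_n}(t)| = e^{-\sigma_n^2 t^2 / 2} forces
% \sigma_n^2 \to \sigma^2 \in [0, \infty); then \mu_n \to \mu follows,
% so X \sim N(\mu, \sigma^2) (possibly with \sigma^2 = 0).
```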
If $X_n$ is a Cauchy sequence in measure, then there exists a random variable $X$ such that $X_n \to X$ in probability.
$X_n \to X$ in probability if, and only if, $\mathbb{E}\min\{|X_n-X|,1\} \to 0$ as $n \to \infty$.
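Both directions follow from elementary bounds on the truncated moment: for $0 < \varepsilon < 1$,

```latex
% On \{|X_n - X| > \varepsilon\} we have \min\{|X_n - X|, 1\} \ge \varepsilon;
% on the complement, \min\{|X_n - X|, 1\} \le \varepsilon. Hence
\varepsilon \,\mathbb{P}(|X_n - X| > \varepsilon)
  \;\le\; \mathbb{E}\min\{|X_n - X|, 1\}
  \;\le\; \varepsilon + \mathbb{P}(|X_n - X| > \varepsilon).
```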
Convergence in probability implies almost sure convergence of a subsequence.
If $S_n = \sum_{j=1}^n X_j$ is a sum of independent random variables which converges in probability, then $S_n$ converges almost surely (Lévy's equivalence theorem).
If $S_n = \sum_{j=1}^n X_j$ is a sum of independent random variables which converges in distribution, then $S_n$ converges almost surely.
Almost sure convergence is not metrizable (on an atomless probability space it is not induced by any topology).
- Convergence in probability implies convergence in distribution.
- The converse is true if the limit is a constant.
- Convergence in probability is preserved by continuous maps.
- Here is a thread dealing with a link between almost sure convergence and convergence in $\mathbb L^1$. It shows that if $X_n \rightarrow X$ almost surely and $\mathbb{E}|X_n| \rightarrow \mathbb{E}|X| < \infty$, then $X_n \rightarrow X$ in $L^1$, i.e. $\Bbb E[|X_n-X|] \to 0$ (Scheffé's lemma).
- If $(X_n)$ and $X$ are random variables such that $X_n \to X$ in distribution and such that $\{X_n\mid n\geq 1\}$ is uniformly integrable, then $E[X_n]\to E[X]$.
- Combining the facts above, a subsequence argument shows that convergence in probability together with uniform integrability is equivalent to convergence in $\mathbb L^1$.
- Convergence a.s. implies almost uniform convergence by Egorov's theorem. Similarly, convergence in probability implies almost uniform convergence of a subsequence.