What is the intuitive difference between almost sure convergence and convergence in probability?
Solution 1:
For simplicity, consider the case where the limit is $X = 0$ and each $X_n$ is the indicator function of an event $E_n$. "$X_n$ converges almost surely to $0$" says that with probability $1$, only finitely many of the events $E_n$ occur. "$X_n$ converges in probability to $0$" says that the probability of the event $E_n$ goes to $0$ as $n \to \infty$.
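Restating this in symbols (using the convention $0 < \varepsilon < 1$, so that $\{|X_n| > \varepsilon\} = E_n$):

$$X_n \to 0 \ \text{almost surely} \iff \Pr\Big(\limsup_{n\to\infty} E_n\Big) = \Pr\Big(\bigcap_{m \ge 1}\,\bigcup_{n \ge m} E_n\Big) = 0,$$

$$X_n \to 0 \ \text{in probability} \iff \Pr\big(|X_n| > \varepsilon\big) = \Pr(E_n) \to 0 \quad \text{as } n \to \infty.$$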
Now consider a case where, for each $m$, you partition the sample space into $m$ events of probability $1/m$ each, and string all of these events together (for $m = 1, 2, 3, \dots$) to form the sequence $E_n$. Then $X_n \to 0$ in probability, because as $n \to \infty$ the corresponding $m \to \infty$ and so $\Pr(E_n) = 1/m \to 0$; but each sample point lies in exactly one event of every partition, hence in infinitely many of the $E_n$, so $X_n$ does not converge to $0$ almost surely.
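A concrete model makes this explicit (the choice of sample space here is an illustration; the argument above does not depend on it): take $\Omega = [0,1)$ with Lebesgue measure, and for the $m$-th partition take the intervals

$$E = \Big[\tfrac{k-1}{m}, \tfrac{k}{m}\Big), \qquad k = 1, \dots, m, \qquad \Pr(E) = \tfrac{1}{m}.$$

Listing these intervals for $m = 1, 2, 3, \dots$ gives the sequence $E_n$, with $\Pr(E_n) \to 0$; yet every $\omega \in [0,1)$ falls in exactly one interval of each partition, so $\omega \in E_n$ infinitely often and $\limsup_n X_n(\omega) = 1 \ne 0$. This construction is sometimes called the "typewriter sequence."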