IID Random Variables that are not constant can't converge almost surely

I am trying to prove the following.

If $\{ X_n \}$ are iid random variables and not constant, then $R:=P\{ \omega \mid X_n(\omega)\text{ converges} \}=0$

Using independence, I know by Kolmogorov's $0$-$1$ law that either $R=0$ or $R=1$. So I think the way to do this proof is by contradiction: I am trying to show that $R=1$ implies the $X_n$ are constant, using their identical distribution, but sadly it is not working. Help would be appreciated. Thanks!
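(For reference, one way to see why the $0$-$1$ law applies: by the Cauchy criterion, for every $m$

$$\{\omega : X_n(\omega)\text{ converges}\} = \bigcap_{k \ge 1} \bigcup_{N \ge m} \bigcap_{n, n' \ge N} \left\{ |X_n - X_{n'}| \le \tfrac{1}{k} \right\} \in \sigma(X_m, X_{m+1}, \dots),$$

so the convergence set lies in the tail $\sigma$-algebra $\bigcap_m \sigma(X_m, X_{m+1}, \dots)$.)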


Solution 1:

For every $c$, the event $\left\{\omega: X_n(\omega)\text{ converges and } \lim_n X_n(\omega) \le c \right\}$ is in the tail $\sigma$-algebra, so by Kolmogorov's $0$-$1$ law it has probability $0$ or $1$. Hence, if the $X_n$ converge with positive probability, they must converge to a constant almost surely.
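One way to make this precise (writing $c_0$ for the limiting constant): if $R>0$, then $R=1$ by the $0$-$1$ law, and $F(c) := P\left(\lim_n X_n \le c\right)$ is a distribution function taking only the values $0$ and $1$, so $\lim_n X_n$ is almost surely equal to

$$c_0 := \inf\left\{ c \in \mathbb{R} : P\left(\lim_n X_n \le c\right) = 1 \right\}.$$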

But this cannot happen if the $X_n$ are not constant: since the $X_n$ are identically distributed and not almost surely equal to $c$, there exist $\epsilon > 0$ and $\delta > 0$ such that

$P(X_n < c - \epsilon) \ge \delta > 0$ for every $n$

or $P(X_n > c + \epsilon) \ge \delta > 0$ for every $n$.

In either case, the events in question are independent and their probabilities sum to infinity, so by the second Borel-Cantelli lemma $|X_n - c| > \epsilon$ for infinitely many $n$ almost surely. Hence $X_n$ cannot converge to $c$ almost surely, contradicting $R = 1$, and therefore $R = 0$.
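Spelled out for the second case (the first is symmetric): the events $\{X_n > c + \epsilon\}$ are independent, and

$$\sum_{n \ge 1} P(X_n > c + \epsilon) \ge \sum_{n \ge 1} \delta = \infty \quad\Longrightarrow\quad P\left(X_n > c + \epsilon \text{ for infinitely many } n\right) = 1$$

by the second Borel-Cantelli lemma, so almost every $\omega$ satisfies $|X_n(\omega) - c| > \epsilon$ infinitely often.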