Proof for convergence in distribution implying convergence in probability for constants

I'm trying to understand this proof (shown in the image below), which shows that if $X_{n}$ converges in distribution to some constant $c$, then it converges to $c$ in probability as well. Specifically, my questions about the proof are:

  1. How are they getting $\lim_{n \to \infty} F_{X_{n}}(c+\frac{\epsilon}{2}) = 1$?

  2. Why do they state the conclusion at the end in this way? They're basically saying that knowing $\lim_{n \to \infty}P(|X_{n} - c| \geq \epsilon) \geq 0$ allows you to conclude that $\lim_{n \to \infty}P(|X_{n} - c| \geq \epsilon) = 0$, but the real reason we can conclude this is the whole body of the proof above, right?

[image of the textbook proof]


Solution 1:

If a sequence of random variables $X_n$ converges to $X$ in distribution, then the distribution functions $F_{X_n}(x)$ converge to $F_X(x)$ at all points of continuity of $F_X$.
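For intuition on why the restriction to continuity points matters, consider the standard example (not from the book) $X_n = c + \frac{1}{n}$: here $X_n \to c$ in distribution, yet $$ F_{X_n}(c) = P\Big(c+\frac{1}{n}\leq c\Big) = 0 \quad\text{for all } n, \qquad\text{while}\qquad F_X(c)=1, $$ so $F_{X_n}(c)\not\to F_X(c)$ at the one discontinuity point $x=c$. This is exactly why the definition only demands convergence at continuity points.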

In this case $X=c$, so $F_X(x)=0$ if $x<c$ and $F_X(x)=1$ if $x\geq c$. $F_X$ is continuous everywhere except at $x=c$, and $c+\frac{\varepsilon}{2}>c$ is therefore a continuity point, hence $$ \lim_{n\to\infty}F_{X_n}\Big(c+\frac{\varepsilon}{2}\Big)=F_X\Big(c+\frac{\varepsilon}{2}\Big)=1 $$ as claimed.
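The image with the book's proof isn't reproduced here, but a standard argument that matches the quantity $F_{X_n}(c+\frac{\varepsilon}{2})$ quoted in the question runs as follows (a sketch; the book's exact inequalities may differ): $$ P(|X_n-c|\geq\varepsilon) = P(X_n\leq c-\varepsilon)+P(X_n\geq c+\varepsilon) \leq F_{X_n}(c-\varepsilon)+1-F_{X_n}\Big(c+\frac{\varepsilon}{2}\Big), $$ where the last step uses $P(X_n\geq c+\varepsilon)\leq P\big(X_n>c+\frac{\varepsilon}{2}\big)=1-F_{X_n}\big(c+\frac{\varepsilon}{2}\big)$. Both $c-\varepsilon$ and $c+\frac{\varepsilon}{2}$ are continuity points of $F_X$, so the right-hand side converges to $0+1-1=0$, giving $\limsup_{n\to\infty}P(|X_n-c|\geq\varepsilon)\leq 0$.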

For the second part, the argument in the body of the proof has shown that the limit is at most $0$, and the point the book is making (somewhat clumsily) is that the limit is of course non-negative, since probabilities are non-negative; these two facts together imply that the limit is zero. So yes, you are right: the non-negativity alone concludes nothing, and it is only in combination with the upper bound from the rest of the proof that the conclusion follows.
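Written out, the final step is the squeeze (this is presumably what the book compresses into one sentence): $$ 0 \leq \liminf_{n\to\infty}P(|X_n-c|\geq\varepsilon) \leq \limsup_{n\to\infty}P(|X_n-c|\geq\varepsilon) \leq 0, $$ so the limit exists and equals $0$, which is precisely the definition of $X_n\to c$ in probability.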