Convergence in distribution and in probability
Let $(X_n)_n$ be a sequence of independent random variables and let $Y_n=\sum_{k=1}^n X_k.$ Suppose that $(Y_n)_n$ does not converge in probability. Then we can find $\epsilon>0$ and sequences $(p_n)_n$ and $(q_n)_n$ in $\mathbb{N}$ with $q_n > p_n \geq n$ such that $\forall n \in \mathbb{N},\ P(|Y_{q_n}-Y_{p_n}|>\epsilon)>\epsilon.$ This means that $W_n=Y_{q_n}-Y_{p_n}$ does not converge in probability to $0$, and hence does not converge in distribution to $0$; so we can find $x_0 \in \mathbb{R}$ such that $\varphi_{W_n}(x_0)$ does not converge to $1$. Since $Y_{q_n}=W_n+Y_{p_n}$ with $W_n$ and $Y_{p_n}$ independent, we have $\varphi_{Y_{q_n}}(x_0)=\varphi_{W_n}(x_0)\varphi_{Y_{p_n}}(x_0).$ How can we deduce that $(Y_n)_n$ does not converge in distribution? (Can we use a Cauchy-type criterion, i.e. prove that $\varphi_{Y_{p_n}}(x_0)-\varphi_{Y_{q_n}}(x_0)$ does not converge to $0$?)
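As a minimal numerical sketch of the factorization $\varphi_{Y_{q_n}}(x_0)=\varphi_{W_n}(x_0)\varphi_{Y_{p_n}}(x_0)$, take the illustrative (made-up) choice $X_k = 2^{-k}\varepsilon_k$ with $\varepsilon_k$ i.i.d. Rademacher signs, so that $\varphi_{X_k}(t)=\cos(2^{-k}t)$ and, by independence, the characteristic function of any block of the sum is a product of these factors (the helper name `phi_block` is hypothetical):

```python
import math

def phi_block(a, b, t):
    # Characteristic function of sum_{k=a}^{b} X_k for independent
    # X_k = 2^{-k} * epsilon_k (epsilon_k i.i.d. Rademacher signs):
    # by independence it is the product of the factors cos(2^{-k} t).
    prod = 1.0
    for k in range(a, b + 1):
        prod *= math.cos(t / 2**k)
    return prod

t, p, q = 1.3, 3, 7  # arbitrary point and indices with q > p
lhs = phi_block(1, q, t)                           # phi_{Y_q}(t)
rhs = phi_block(p + 1, q, t) * phi_block(1, p, t)  # phi_{W}(t) * phi_{Y_p}(t)
print(lhs, rhs)  # agree up to floating-point rounding
```

The two quantities coincide because $Y_q = W + Y_p$ splits the sum into two independent blocks of indices.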
Assume $Y_n$ converges in distribution. Then, for all $t\in\mathbb{R}$, $\varphi_{Y_n}(t)\to \varphi(t)$ for some characteristic function $\varphi$. Since $\varphi(0)=1$ and $\varphi$ is continuous, there exists a neighborhood $N$ of $0$ on which $\varphi(t)\neq 0$. For such $t$, since $p_n, q_n \geq n \to \infty$, both $\varphi_{Y_{p_n}}(t)$ and $\varphi_{Y_{q_n}}(t)$ converge to $\varphi(t)$, so $$\frac{\varphi_{Y_{q_n}}(t)}{\varphi_{Y_{p_n}}(t)}\to 1.$$ (Here $q_n>p_n$, as implicitly assumed in the second half of the question.) By independence this ratio equals $\varphi_{W_n}(t)$, so $\varphi_{W_n}(t)\to 1$ for all $t\in N$. But then (see this post) $W_n$ converges in distribution to $0$. Thus $W_n$ converges in probability to $0$, a contradiction.
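The contrapositive direction of this ratio argument can be seen numerically in a convergent case. With the same illustrative assumption $X_k = 2^{-k}\varepsilon_k$ ($\varepsilon_k$ i.i.d. Rademacher), the series $Y_n$ converges, $\varphi_{Y_n}(t)=\prod_{k=1}^n \cos(2^{-k}t)$, and the ratio $\varphi_{Y_{2p}}(t)/\varphi_{Y_p}(t)=\varphi_{W}(t)$ with $W=Y_{2p}-Y_p$ indeed tends to $1$ (the helper name `phi_Y` is made up):

```python
import math

def phi_Y(n, t):
    # Characteristic function of Y_n = X_1 + ... + X_n for the
    # illustrative choice X_k = 2^{-k} * epsilon_k, epsilon_k i.i.d.
    # Rademacher: phi_{Y_n}(t) is the product of cos(2^{-k} t).
    prod = 1.0
    for k in range(1, n + 1):
        prod *= math.cos(t / 2**k)
    return prod

t = 1.0  # a point near 0 where the limit phi(t) is nonzero
for p in (5, 10, 20):
    # phi_{Y_{2p}}(t) / phi_{Y_p}(t) = phi_W(t) for W = Y_{2p} - Y_p;
    # the ratio approaches 1 as p grows, as in the proof above.
    print(p, phi_Y(2 * p, t) / phi_Y(p, t))
```

Conversely, when $(Y_n)_n$ is not Cauchy in probability, the argument above shows this ratio cannot tend to $1$ along the subsequences $(p_n)$, $(q_n)$.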