Problem on convergence of random variables
Let $Y_1, Y_2,\ldots$ be independent random variables with $Y_n\sim \mathcal{N}(\mu, \sigma^2_n)$, where $\sigma^2_n \rightarrow \sigma > 0$ as $n \to \infty$.
I want to show that $Y_n \overset{\text{p}}{\nrightarrow} Y$ for any random variable $Y$. To argue by contradiction, I assume that $Y_n \overset{\text{p}}{\rightarrow} Y$ and consider
$$|Y_n-Y_{n+1}|\le |Y_n-Y|+|Y_{n+1}-Y|$$
Thus, I want to derive a contradiction from the above inequality. I would be glad to get some help with that.
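For reference, here is one way the contradiction can be extracted from this inequality (a sketch, assuming $\sigma > 0$ and using the independence of the $Y_n$). By independence, $Y_n - Y_{n+1} \sim \mathcal{N}(0, \sigma_n^2 + \sigma_{n+1}^2)$, so for any fixed $\varepsilon > 0$,

$$P(|Y_n - Y_{n+1}| > \varepsilon) = 2\left(1 - \Phi\Big(\frac{\varepsilon}{\sqrt{\sigma_n^2 + \sigma_{n+1}^2}}\Big)\right) \longrightarrow 2\left(1 - \Phi\Big(\frac{\varepsilon}{\sqrt{2\sigma}}\Big)\right) > 0,$$

where $\Phi$ is the standard normal CDF. On the other hand, the displayed inequality together with $Y_n \overset{\text{p}}{\rightarrow} Y$ would force $P(|Y_n - Y_{n+1}| > \varepsilon) \le P(|Y_n - Y| > \varepsilon/2) + P(|Y_{n+1} - Y| > \varepsilon/2) \to 0$, a contradiction.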
You need the extra condition $\sigma > 0$: if $(X_n)$ is i.i.d. $N(0,1)$, then $Y_n = \frac{1}{n} X_n$ gives a counterexample to your statement when $\sigma = 0$, since then $Y_n \to 0$ in probability.
When $\sigma>0$ we can argue as follows. Suppose $Y_n \to Y$ in probability. Extract a subsequence $(Y_{n_k})$ that converges almost surely to $Y$. Since $Y$ is the a.s. limit of $Y_{n_i}, Y_{n_{i+1}}, Y_{n_{i+2}}, \ldots$, it is measurable with respect to $\sigma(Y_{n_j} : j \ge i)$, hence independent of $Y_{n_k}$ for $k \leq i-1$. This is true for each $i$, so $Y$ is independent of itself, and therefore $Y$ is a.s. constant. [Note: if you are familiar with Kolmogorov's $0$-$1$ law, you can immediately conclude that $Y$ is a.s. constant, since it is tail-measurable.]

But convergence of $(Y_n)$ to $Y$ in distribution implies (via characteristic functions) that $0 = \operatorname{Var}(Y) = \lim_n \operatorname{Var}(Y_n) = \lim_n \sigma_n^{2} = \sigma > 0$, a contradiction.
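The fact that consecutive terms stay apart with non-vanishing probability (so $(Y_n)$ is not Cauchy in probability) can also be checked numerically. The sketch below uses the illustrative choice $\sigma_n^2 = \sigma(1 + 1/n)$, which is an assumption for the demo, not part of the problem; it computes $P(|Y_n - Y_{n+1}| > \varepsilon)$ exactly from the normal tail, since $Y_n - Y_{n+1} \sim \mathcal{N}(0, \sigma_n^2 + \sigma_{n+1}^2)$ by independence.

```python
import math

def two_sided_tail(x):
    """P(|Z| > x) for a standard normal Z, via the error function:
    2*(1 - Phi(x)) with Phi(x) = 0.5*(1 + erf(x/sqrt(2)))."""
    return 2 * (1 - 0.5 * (1 + math.erf(x / math.sqrt(2))))

sigma = 1.0   # limit of the variances; the argument needs sigma > 0
eps = 0.5     # fixed epsilon in the Cauchy-in-probability criterion

for n in [10, 100, 1000, 10**6]:
    # Illustrative variance sequence sigma_n^2 = sigma*(1 + 1/n) -> sigma
    var_n = sigma * (1 + 1 / n)
    var_n1 = sigma * (1 + 1 / (n + 1))
    # P(|Y_n - Y_{n+1}| > eps), exact for independent normals
    p = two_sided_tail(eps / math.sqrt(var_n + var_n1))
    print(n, p)

# Limiting value 2*(1 - Phi(eps / sqrt(2*sigma))) > 0
limit = two_sided_tail(eps / math.sqrt(2 * sigma))
print(limit)
```

The printed probabilities approach the strictly positive limit $2(1 - \Phi(\varepsilon/\sqrt{2\sigma}))$ rather than $0$, which is exactly what rules out convergence in probability.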