Linearity of convergence in distribution of random variables

Solution 1:

Suppose that $X_n$ converges in distribution to $X$, where $X$ is a symmetric random variable, say $X \sim N(0,1)$. Then, trivially, $X_n$ also converges in distribution to $-X$, since $X$ and $-X$ are identically distributed. However, $X_n + X_n = 2X_n$ converges in distribution to $2X \sim N(0,4)$, not to $X + (-X) = 0$.
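A quick numerical sanity check of this counterexample (a sketch using NumPy; taking $X_n = X$ for every $n$ is the simplest concrete choice):

```python
import numpy as np

rng = np.random.default_rng(0)

# Take X_n = X ~ N(0,1) for every n, so X_n -> X and also X_n -> -X
# in distribution (X and -X are identically distributed).
x = rng.standard_normal(100_000)

# X_n + X_n = 2 X_n, whose distribution is N(0, 4),
# not the point mass at 0 that X + (-X) would suggest.
s = x + x
print(s.mean(), s.var())  # mean near 0, variance near 4 (not 0)
```

The empirical variance sits near 4, confirming that the sum's limit is $N(0,4)$ rather than the degenerate distribution at 0.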

Solution 2:

Convergence in distribution is a pretty weak concept. Consider probability distributions on $[0,1]$. Let $X_n = X$ for all $n$ have a density supported on $[0,1/2]$ alone, let $Y_n = X_n + 1/2$, and let $Y$ be independent of $X$ with the same shifted density (supported on $[1/2,1]$). Then $X_n \to X$ and $Y_n \to Y$ in distribution automatically, since none of the distributions change with $n$. But every $X_n + Y_n = 2X_n + 1/2$ has the same fixed distribution, which differs from the distribution of $X + Y$ (the convolution of the two independent densities), so $X_n + Y_n$ does not converge in distribution to $X + Y$. For instance, if the density is uniform, each $X_n + Y_n$ is uniform on $[1/2, 3/2]$, while $X + Y$ has a triangular density on the same interval.
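The same construction can be checked numerically (a sketch; the uniform density on $[0,1/2]$ is an arbitrary concrete choice):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# X_n = X uniform on [0, 1/2]; Y_n = X_n + 1/2 has the shifted density,
# and Y is an independent draw from that same shifted distribution.
x = rng.uniform(0.0, 0.5, n)
y = rng.uniform(0.5, 1.0, n)   # independent copy of Y_n's distribution

sum_n = x + (x + 0.5)          # X_n + Y_n = 2 X_n + 1/2: uniform on [1/2, 3/2]
sum_limit = x + y              # X + Y: triangular density on [1/2, 3/2]

# Same support and same mean, but the variances differ: 1/12 vs 1/24,
# so the distributions of X_n + Y_n and X + Y are genuinely different.
print(sum_n.var(), sum_limit.var())
```

The empirical variances come out near $1/12 \approx 0.083$ and $1/24 \approx 0.042$ respectively, so even though $X_n \to X$ and $Y_n \to Y$ in distribution, the sums $X_n + Y_n$ stay away from the distribution of $X + Y$.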