Is the variance of a sum of infinitely many independent random variables the sum of their variances?

For $X_i$ independent, is $\operatorname{Var}\left(\sum \limits_{i = 0}^\infty X_i \right) = \sum\limits_{i=0}^\infty \operatorname{Var}(X_i)$?

Thanks!


Yes, provided the RHS is finite and the series $\sum\limits_{n}\mathrm E(X_n)$ converges (this last condition is what guarantees that the sum $\sum\limits_n X_n$ itself converges, not only its centered version).
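
Before the proof, a quick numerical sanity check may help. This is a minimal Monte Carlo sketch; the Normal terms, with means $2^{-n}$ and variances $1/n^2$, are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N, trials = 200, 100_000

n = np.arange(1, N + 1)
means = 2.0 ** -n   # E(X_n) = 2**-n, so the series of means converges
sigmas = 1.0 / n    # Var(X_n) = 1/n**2, so the series of variances converges

# trials independent realizations of (X_1, ..., X_N), each row summed to S_N
X = rng.normal(loc=means, scale=sigmas, size=(trials, N))
S_N = X.sum(axis=1)

print("empirical Var(S_N):", S_N.var())
print("sum of variances  :", (sigmas ** 2).sum())
```

Both printed values should agree up to Monte Carlo error and up to the truncation of the series at $N$ terms (here both are close to $\pi^2/6$).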

To see this, assume without loss of generality that the random variables $X_n$ are centered with variances $\sigma_n^2$ and that the series $\sum\limits_n\sigma_n^2$ converges, with $\sigma^2$ as its sum. Let $S_n=\sum\limits_{k\leqslant n}X_k$ and note that, by independence, for every $n\leqslant m$, $$ \mathrm E((S_m-S_n)^2)=\sum\limits_{k=n+1}^m\sigma_k^2, $$ which converges to zero when $n\to\infty$, uniformly in $m$, hence $(S_n)_n$ is a Cauchy sequence in $L^2$. Let $S$ denote its limit in $L^2$.

Then $\mathrm E(S_n^2)\to\mathrm E(S^2)$ when $n\to\infty$ and, for every $n$, $$ \mathrm E(S_n^2)=\sum\limits_{k\leqslant n}\sigma_k^2, $$ hence $\mathrm E(S^2)=\sigma^2$, that is, $\operatorname{Var}(S)=\sum\limits_n\operatorname{Var}(X_n)$.

Finally, since $S_n\to S$ in $L^2$, a subsequence converges almost surely to $S$, and Kolmogorov's maximal inequality upgrades this to almost sure convergence of the whole sequence $(S_n)_n$ to $S$, which completes the proof.
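
The Cauchy-in-$L^2$ step can also be checked numerically: the empirical value of $\mathrm E((S_m-S_n)^2)$ should match the tail sum of variances. In this sketch, the centered Normal terms with $\sigma_k=1/k$ are again an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, trials = 400, 100, 100_000
sigmas = 1.0 / np.arange(1, m + 1)   # sigma_k = 1/k

# trials realizations of (X_1, ..., X_m) with X_k ~ Normal(0, sigma_k**2)
X = rng.normal(0.0, sigmas, size=(trials, m))
S = X.cumsum(axis=1)                 # S[:, k-1] holds the partial sum S_k

diff = S[:, m - 1] - S[:, n - 1]     # S_m - S_n
print("empirical E((S_m - S_n)^2):", (diff ** 2).mean())
print("tail sum of variances     :", (sigmas[n:] ** 2).sum())
```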

If the sum of the series $\sum\limits_n\sigma_n^2$ is infinite and the centered variables $X_n$ are uniformly bounded, then $(S_n)_n$ diverges almost surely, by the converse half of Kolmogorov's two-series theorem (without some such boundedness condition, divergence can fail). For instance, if $X_n=\pm1$ with probability $\frac12$ each, then $(S_n)_n$ is a simple random walk, whose fluctuations grow without bound, as the sketch below illustrates.
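
A sketch of this divergent case, with fair $\pm1$ coin flips so that $\sigma_n^2=1$ for every $n$:

```python
import numpy as np

rng = np.random.default_rng(2)
N, paths = 100_000, 5

X = rng.choice([-1.0, 1.0], size=(paths, N))   # fair +/-1 steps, Var = 1
S = X.cumsum(axis=1)                           # simple random walks

# the fluctuations keep growing instead of settling down
print("max |S_n| per path :", np.abs(S).max(axis=1))
print("last value per path:", S[:, -1])
```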