Summation of variances $\sigma^2$ of several independent random signals

Is it possible to prove that for several independent random signals (variables), their variances $\sigma^2$ add?

I found this stated here, but without proof.


Solution 1:

As long as you know that $\mathbb{E}(XY)=\mathbb{E}(X)\mathbb{E}(Y)$ for independent $X$ and $Y$, the result follows directly from the definition of variance.

\begin{align} \sigma^2_{X+Y} &=\mathbb{E}((X+Y)^2)-(\mathbb{E}(X+Y))^2\\ &=\mathbb{E}(X^2)+2\mathbb{E}(XY)+\mathbb{E}(Y^2)-\mathbb{E}(X)^2-2\mathbb{E}(X)\mathbb{E}(Y)-\mathbb{E}(Y)^2 \\ &=\mathbb{E}(X^2)-\mathbb{E}(X)^2+\mathbb{E}(Y^2)-\mathbb{E}(Y)^2+2\big(\mathbb{E}(XY)-\mathbb{E}(X)\mathbb{E}(Y)\big)\\ &=\sigma_X^2+\sigma_Y^2, \end{align}

where the cross term $2\big(\mathbb{E}(XY)-\mathbb{E}(X)\mathbb{E}(Y)\big)$ vanishes by independence. Applying this pairwise extends the result by induction to any finite number of independent variables.
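As a quick numerical sanity check (not part of the proof), here is a sketch that simulates two independent samples with known variances and compares the variance of their sum against the sum of their variances; the distributions and sample size are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent samples with known population variances:
x = rng.normal(loc=1.0, scale=2.0, size=n)   # Var(X) = 2^2 = 4
y = rng.uniform(low=-3.0, high=3.0, size=n)  # Var(Y) = 6^2 / 12 = 3

var_sum = np.var(x + y)              # empirical variance of X + Y
var_parts = np.var(x) + np.var(y)    # sum of the individual variances

print(var_sum, var_parts)  # both should be close to 4 + 3 = 7
```

With a million samples the two estimates agree to a few decimal places, matching $\sigma^2_{X+Y}=\sigma_X^2+\sigma_Y^2$.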

I hope this helps.