Linearity of expectation in the case of dependent events
I understand linearity of expectation for independent events, but why does it also hold for dependent events? It seems counter-intuitive: with dependent events, each outcome influences subsequent outcomes, so it seems they cannot simply be summed to get the expectation without accounting for this interdependence.
Can anybody give an intuitive explanation of why linearity works even for dependent events?
By definition, for any constant $\alpha$ and random variables $X$, $Y$ defined on the same probability space, \begin{align*} \mathbb{E}\left[\alpha f\left(X\right)+g\left(Y\right)\right] & =\int_{\Omega}\left[\alpha f\left(X\left(\omega\right)\right)+g\left(Y\left(\omega\right)\right)\right]d\mathbb{P}\left(\omega\right)\\ & =\alpha\int_{\Omega}f\left(X\left(\omega\right)\right)d\mathbb{P}\left(\omega\right)+\int_{\Omega}g\left(Y\left(\omega\right)\right)d\mathbb{P}\left(\omega\right)\\ & =\alpha\mathbb{E}\left[f\left(X\right)\right]+\mathbb{E}\left[g\left(Y\right)\right]. \end{align*} Notice that nothing in this computation uses independence: the key step is just linearity of the integral, which holds regardless of how $X$ and $Y$ are related.
For a concrete example, consider a discrete random variable $X$ that corresponds to rolling a fair 3-sided die, and note that $X$ and $X^{2}$ are as dependent as possible (each determines the other). Still, \begin{align*} \mathbb{E}\left[X+X^{2}\right] & =\frac{1}{3}\left(1+1^{2}\right)+\frac{1}{3}\left(2+2^{2}\right)+\frac{1}{3}\left(3+3^{2}\right)\\ & =\frac{1}{3}\left(1+2+3\right)+\frac{1}{3}\left(1^{2}+2^{2}+3^{2}\right)\\ & =\mathbb{E}\left[X\right]+\mathbb{E}\left[X^{2}\right]. \end{align*}
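The die example can also be checked numerically. Here is a small sketch (using Python's `fractions` module for exact arithmetic) that computes both sides of the identity over the three equally likely outcomes:

```python
from fractions import Fraction

# Fair 3-sided die: outcomes 1, 2, 3, each with probability 1/3.
outcomes = [1, 2, 3]
p = Fraction(1, 3)

# Left side: E[X + X^2], summing the combined quantity over outcomes.
lhs = sum(p * (x + x**2) for x in outcomes)

# Right side: E[X] + E[X^2], computing each expectation separately.
e_x = sum(p * x for x in outcomes)
e_x2 = sum(p * x**2 for x in outcomes)
rhs = e_x + e_x2

print(lhs, rhs)  # both are 20/3
```

Both sides agree even though $X$ and $X^{2}$ are completely dependent, because each expectation is a sum over the same underlying outcomes and the sums can be regrouped freely.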