Proof of linearity of expectation when the random variables are dependent

The proof of linearity of expectation when the random variables are independent is intuitive. What is the proof when they are dependent?

Formally, $$ E(X+Y)=E(X)+E(Y)$$ where $X$ and $Y$ are dependent random variables.

The proof below assumes that $X$ and $Y$ belong to the sample space. That is, they map from the sample space to the real number line. Is that also a condition for linearity of expectation?

Proof: $$E(X+Y) =\sum\limits_{s \in S}(X+Y)(s)\, P(\{s\}) $$ $$E(X+Y) =\sum\limits_{s \in S}(X(s)+Y(s))\, P(\{s\}) $$ $$E(X+Y) =\sum\limits_{s \in S} X(s)P(\{s\})+ \sum\limits_{s \in S} Y(s)P(\{s\}) $$ $$E(X+Y) =E(X)+E(Y)$$ Here $S$ is the sample space and $s$ is an outcome in $S$.

Reference: the lecture in which this proof appears.

Also, more reasoning for step 2 would be helpful. I don't understand it completely.
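For what it is worth, I did check the identity numerically on a small finite sample space where $X$ and $Y$ are clearly dependent, and it does hold. The outcomes, probabilities, and variables in the snippet below are made up purely for the check.

```python
# Sanity check of E(X+Y) = E(X) + E(Y) on a small finite sample space,
# with X and Y deliberately dependent.
# Outcomes, probabilities and variables are made up for illustration only.

from fractions import Fraction

# Sample space: two coin tosses, with unequal probabilities.
P = {
    "HH": Fraction(1, 2),
    "HT": Fraction(1, 4),
    "TH": Fraction(1, 8),
    "TT": Fraction(1, 8),
}

# X(s) = number of heads; Y(s) = 1 if the two tosses agree, else 0.
# Y is clearly dependent on X (for example, X = 1 forces Y = 0).
X = {"HH": 2, "HT": 1, "TH": 1, "TT": 0}
Y = {"HH": 1, "HT": 0, "TH": 0, "TT": 1}

def E(Z):
    """Expectation of a random variable given as a dict s -> Z(s)."""
    return sum(Z[s] * P[s] for s in P)

lhs = sum((X[s] + Y[s]) * P[s] for s in P)  # E(X + Y), computed directly
rhs = E(X) + E(Y)
print(lhs, rhs, lhs == rhs)                 # 2 2 True
```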


Solution 1:

The proof below assumes that $X$ and $Y$ belong to the sample space. That is, they map from the sample space to the real number line. Is that also a condition for linearity of expectation?

No.   It's the definition of a random variable.

Basically, any random variable $X$ is a function that maps the sample space to the reals (or a subset thereof, called the support): $$X: \Omega \to \Bbb R$$

If $X$ and $Y$ are both random variables on the same sample space, then so is their sum $X+Y$. (The sum is not defined if they are not on the same sample space.)

$$ X:\Omega\to\Bbb R~\wedge~ Y:\Omega\to \Bbb R ~~\implies~~ X+Y:\Omega\to\Bbb R \\ \forall s\in\Omega,\quad(X+Y)(s) := X(s)+Y(s)$$
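To make that pointwise definition concrete, here is a minimal sketch; the sample space `Omega` and the values of `X` and `Y` are made up purely for illustration.

```python
# Pointwise definition of the sum of two random variables on the same
# sample space: (X + Y)(s) := X(s) + Y(s) for every outcome s.
# Omega, X and Y are made-up illustrations, not anything canonical.

Omega = ["a", "b", "c"]

X = {"a": 0.0, "b": 1.0, "c": 2.0}
Y = {"a": 5.0, "b": 3.0, "c": 1.0}

# The sum X + Y is again a function from Omega to the reals.
X_plus_Y = {s: X[s] + Y[s] for s in Omega}

print(X_plus_Y)  # {'a': 5.0, 'b': 4.0, 'c': 3.0}
```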

Linearity of expectation then follows directly from the definition of expectation.

$\begin{align} \mathsf E(X+Y) =&~ \sum_{\omega\in\Omega} (X+Y)(\omega)~\mathsf P(\omega) \\[1ex] =&~ \sum_{\omega\in\Omega} \big(X(\omega)+Y(\omega)\big)~\mathsf P(\omega) \\[1ex] =&~ \sum_{\omega\in \Omega} X(\omega)~\mathsf P(\omega)+\sum_{\omega\in \Omega} Y(\omega)~\mathsf P(\omega) \\[1ex] =&~ \mathsf E(X)+\mathsf E(Y) \end{align}$

Notice that no assumption about how $X$ and $Y$ are related is used anywhere: the second line is just the pointwise definition of $X+Y$ above, and the third line only splits a finite sum term by term.
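If it helps, those lines can be checked numerically. In the sketch below, the four-point sample space, the uniform probabilities, and the coupling $Y(\omega)=X(\omega)^2$ are arbitrary choices, picked only so that $Y$ is completely determined by $X$ (i.e. as dependent as possible).

```python
# Check of the discrete derivation on a made-up example where Y is a
# deterministic function of X (so X and Y are as dependent as possible).

Omega = [0, 1, 2, 3]
P = {w: 0.25 for w in Omega}            # uniform probabilities, arbitrary choice

X = {w: float(w) for w in Omega}        # X(w) = w
Y = {w: float(w) ** 2 for w in Omega}   # Y(w) = w^2, completely determined by X

# sum over omega of (X + Y)(omega) * P(omega)  -- the left-hand side
E_sum = sum((X[w] + Y[w]) * P[w] for w in Omega)

# the same sum split term by term -- E(X) + E(Y)
E_X = sum(X[w] * P[w] for w in Omega)
E_Y = sum(Y[w] * P[w] for w in Omega)

print(E_sum, E_X + E_Y)                 # 5.0 5.0
```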

Of course, this is for discrete random variables. For continuous random variables we use integration instead, but everything is analogous, and that is no coincidence.

$\begin{align} \mathsf E(X+Y) =&~ \int_{\Omega} (X+Y)(\omega)~\mathsf P(\mathrm d \omega) \\[1ex] =&~ \int_{\Omega} \big(X(\omega)+Y(\omega)\big)~\mathsf P(\mathrm d \omega) \\[1ex] =&~ \int_{\Omega} X(\omega)~\mathsf P(\mathrm d \omega)+\int_{\Omega} Y(\omega)~\mathsf P(\mathrm d \omega) \\[1ex] =&~ \mathsf E(X)+\mathsf E(Y) \end{align}$

Again, splitting the integral is just linearity of the integral; independence of $X$ and $Y$ is never needed.
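As a rough Monte Carlo sketch of the continuous case: the choice $X \sim N(0,1)$ and the coupling $Y = X^2$ below are arbitrary, used only to make the two variables dependent, and the printed values carry the usual sampling error.

```python
# Monte Carlo sketch of E(X + Y) = E(X) + E(Y) for dependent continuous
# random variables. X ~ N(0, 1) and Y = X^2 are arbitrary illustrative
# choices; the printed values are estimates, not exact expectations.

import random

random.seed(0)
N = 500_000

xs = [random.gauss(0.0, 1.0) for _ in range(N)]  # X ~ N(0, 1)
ys = [x * x for x in xs]                         # Y = X^2, fully dependent on X

E_X = sum(xs) / N
E_Y = sum(ys) / N
E_sum = sum(x + y for x, y in zip(xs, ys)) / N

# Both should be close to 0 + 1 = 1, up to Monte Carlo error.
print(E_sum, E_X + E_Y)
```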