Is this proof for linearity of Expectation correct?
I'm studying the first properties of expectation, and my notes contain a proof of its linearity. However, my notes aren't fully clear, so I tried reinterpreting them; this is the result:
Linearity of Expectation (proof):
Let $X$ and $Y$ be random variables, with $\text{Image}(X) = \{s_1, s_2, \dots, s_k, \dots \}$ and $\text{Image}(Y) = \{t_1, t_2, \dots, t_k, \dots \}$, where both sets have at most countably many elements (the random variables are discrete). Put $P_i = \mathbb{P}(X = s_i)$, $Q_j = \mathbb{P}(Y = t_j)$, and finally $\pi_{ij} = \mathbb{P}[(X = s_i)\cap(Y = t_j)]$, which is the probability that $X = s_i$ and simultaneously $Y = t_j$.
We have that $\text{Image}(X+Y) \subseteq \{ s_1+t_1, s_1+t_2, \dots, s_2+t_1, s_2 + t_2, \dots, s_k +t_1, s_k + t_2, \dots \}$.
So by definition of expectation:
$E[X+Y] = \sum\limits_{i=1}^{\infty}\sum\limits_{j=1}^{\infty}(s_i+t_j)\mathbb{P}(X+Y = s_i+t_j) = \sum\limits_{i=1}^{\infty}\sum\limits_{j=1}^{\infty}(s_i+t_j)\pi_{ij}$
Using the distributive property of multiplication,
$E[X+Y] = \sum\limits_{i=1}^{\infty}\sum\limits_{j=1}^{\infty}s_i\pi_{ij} + \sum\limits_{i=1}^{\infty}\sum\limits_{j=1}^{\infty}t_j\pi_{ij} = \sum\limits_{i=1}^{\infty}s_i\sum\limits_{j=1}^{\infty}\pi_{ij} + \sum\limits_{j=1}^{\infty}t_j\sum\limits_{i=1}^{\infty}\pi_{ij}$
We now observe that by definition $\pi_{ij}$ is $\mathbb{P}[(X = s_i)\cap(Y = t_j)]$
and therefore
$\sum\limits_{j=1}^\infty\pi_{ij} = \sum\limits_{j=1}^\infty\mathbb{P}[(X = s_i)\cap(Y = t_j)]$
Here the event $(Y = t_j)$ is the set $\mathcal{D}_j := Y^{-1}(t_j) = \{ \omega\in\Omega \;\;|\;\; Y(\omega) = t_j\}$, where $\Omega$ is the sample space. Because $Y$ is a function, the sets $\mathcal{D}_j$ are pairwise disjoint, and together they form a partition of $\Omega$, i.e. $\bigcup\limits_{j=1}^\infty\mathcal{D}_j = \Omega$.
Using countable additivity of measures,
$\sum\limits_{j=1}^\infty\pi_{ij} = \mathbb{P}\Big(\bigcup\limits_{j=1}^\infty[(X = s_i) \cap (Y = t_j)]\Big) = \mathbb{P}\Big[(X = s_i) \cap\bigcup\limits_{j=1}^\infty (Y = t_j)\Big] = \mathbb{P}[(X = s_i) \cap \Omega] = \mathbb{P}(X = s_i)$
Repeating the same procedure for $\sum\limits_{i=1}^\infty\pi_{ij}$, we find that
$\sum\limits_{i=1}^\infty\pi_{ij} = \mathbb{P}( Y = t_j )$
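These two marginalization identities are easy to sanity-check numerically. A minimal sketch, assuming a small made-up joint pmf (the matrix below is purely illustrative):

```python
import numpy as np

# Hypothetical joint pmf: pi[i, j] = P(X = s_i, Y = t_j).
# Rows index the values of X, columns the values of Y.
pi = np.array([[0.10, 0.20, 0.10],
               [0.25, 0.15, 0.20]])

assert np.isclose(pi.sum(), 1.0)  # a valid joint distribution

P = pi.sum(axis=1)  # P_i = sum_j pi_ij = P(X = s_i)
Q = pi.sum(axis=0)  # Q_j = sum_i pi_ij = P(Y = t_j)

print(P)  # marginal distribution of X
print(Q)  # marginal distribution of Y
```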
Substituting in $E[X+Y] = \sum\limits_{i=1}^{\infty}s_i\sum\limits_{j=1}^{\infty}\pi_{ij} + \sum\limits_{j=1}^{\infty}t_j\sum\limits_{i=1}^{\infty}\pi_{ij}$
we get:
$E[X+Y] = \sum\limits_{i=1}^{\infty}s_i\mathbb{P}(X = s_i) + \sum\limits_{j=1}^{\infty}t_j\mathbb{P}(Y = t_j) = E[X] + E[Y]$.
Is it correct? My biggest doubt is the step where we state that $\mathbb{P}(X+Y = s_i+t_j) = \pi_{ij}$, because I expect that there could be more than one way to get $X+Y = s_i+t_j$, not only $X = s_i$ and $Y = t_j$.
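To make my doubt concrete, here is a small numerical sketch (the joint pmf is made up): with $X, Y \in \{0, 1\}$, the event $X+Y = 1$ collects two index pairs, so $\mathbb{P}(X+Y = s_i+t_j)$ need not equal $\pi_{ij}$; nevertheless the double sum $\sum_{i,j}(s_i+t_j)\pi_{ij}$ does reproduce $E[X]+E[Y]$:

```python
import numpy as np

s = np.array([0.0, 1.0])  # values of X
t = np.array([0.0, 1.0])  # values of Y

# Hypothetical joint pmf: pi[i, j] = P(X = s_i, Y = t_j) (X and Y are dependent here).
pi = np.array([[0.30, 0.20],
               [0.10, 0.40]])

# P(X + Y = 1) collects TWO index pairs, (i, j) = (0, 1) and (1, 0) ...
p_sum_is_one = pi[0, 1] + pi[1, 0]
# ... so it differs from any single pi_ij (here pi[0, 1] = 0.20).

# The double sum over all (i, j) still computes E[X + Y], because the
# events {X = s_i} ∩ {Y = t_j} partition the sample space.
E_sum = sum((s[i] + t[j]) * pi[i, j] for i in range(2) for j in range(2))
E_X = float((s * pi.sum(axis=1)).sum())
E_Y = float((t * pi.sum(axis=0)).sum())
print(E_sum, E_X + E_Y)
```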
Your proof is fine and shows that the expectation of the sum of two discrete random variables defined on the same probability space is the sum of the individual expectations (provided they exist, of course).
You make use of the distributions of $X, Y, X+Y$, but that is not necessary.
There is a route closer to the definitions that works for all integrable random variables (not only discrete ones).
Let $X,Y$ be random variables defined on probability space $(\Omega,\mathcal A,\mathsf P)$.
Then $Z=X+Y$, prescribed by $\omega\mapsto X(\omega)+Y(\omega)$, is also a random variable, and, if $X$ and $Y$ are integrable, linearity of the integral gives:
$$\mathbb EZ:=\int Z(\omega)\,\mathsf P(d\omega)=\int \big(X(\omega)+Y(\omega)\big)\,\mathsf P(d\omega)=\int X(\omega)\,\mathsf P(d\omega)+\int Y(\omega)\,\mathsf P(d\omega)=\mathbb EX+\mathbb EY$$
The essential fact is just that for integrable functions $f,g$: $$\int f+g\;d\mu=\int f\;d\mu+\int g\;d\mu$$ for any measure $\mu$.
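The same identity can also be illustrated empirically; a minimal Monte Carlo sketch (the distributions and sample size are arbitrary choices), using dependent, non-discrete random variables on a common sample space:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# One sample omega per index: X and Y live on the same probability space
# and are deliberately dependent (Y is built from X).
U = rng.uniform(0.0, 1.0, size=n)
X = rng.normal(loc=1.0, scale=2.0, size=n)
Y = X * U  # dependent on X

# Sample means approximate the integrals ∫X dP, ∫Y dP, ∫(X+Y) dP, and
# mean(X + Y) == mean(X) + mean(Y) holds pointwise, up to rounding,
# mirroring the pointwise identity Z(omega) = X(omega) + Y(omega).
print(np.mean(X + Y), np.mean(X) + np.mean(Y))
```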