What does expected value of sum of two discrete random variables mean?
I am confused about summing two random variables. Suppose $X$ and $Y$ are random variables denoting how much is gained from each of two games. If the two games are played together, we gain $E[X] + E[Y]$ in total on average. I understand this much. However, many textbooks give the equation $E[X+Y]=E[X]+E[Y]$ as the explanation for the expectation of playing the two games together. The explanation seems more difficult than the result.
What do $X+Y$ and $E[X+Y]$ mean? We define $E[X]=\sum_i X_ip_i$. So do we define $E[X+Y]=\sum_i (X_i+Y_i)p_i$, where $p_i$ is the same for both random variables?
What if $X$ denotes the equally likely outcomes $1, 2, 3$ and $Y$ denotes the equally likely outcomes $1, 2, 3, 4, 5$?
Solution 1:
When you are talking about two random variables, you need to think about their joint distribution - so, rather than talking about $P(X=i)$, you need to talk about $P(X=i\text{ and }Y=j)$, or, as we usually write it, $P(X=i,Y=j)$.
If it helps, think of it as randomly choosing a vector with two components - then calling the first component $X$ and the second component $Y$. You can think of $X$ and $Y$ as the separate outcomes of two experiments - which may or may not be related. So, $X$ could be how much you win in the first hand of poker, and $Y$ how much you win in the second. Then $X+Y$ is how much you won in the first two hands together.
With this in hand, for a function $f(x,y)$, we can define (for variables that take discrete values), $$ \mathbb{E}[f(X,Y)]=\sum_{x,y}f(x,y)\cdot P(X=x, Y=y). $$ So, in your particular case, $$ \mathbb{E}[X+Y]=\sum_{x,y}(x+y)P(X=x,Y=y)=\sum_{x,y}xP(X=x,Y=y)+\sum_{x,y}yP(X=x,Y=y). $$ Consider the first of these sums. Note $$ \sum_{x,y}xP(X=x,Y=y)=\sum_{x}x\sum_{y}P(X=x,Y=y). $$ The inner sum here is precisely $P(X=x)$: the event "$X=x$" is the same as the event "$X=x$ and $Y$ takes any value", whose probability is exactly this sum. So, $$ \sum_{x,y}xP(X=x,Y=y)=\sum_{x}x\sum_{y}P(X=x,Y=y)=\sum_{x}xP(X=x)=\mathbb{E}[X]. $$ Similarly, $$ \sum_{x,y}yP(X=x,Y=y)=\mathbb{E}[Y], $$ and combining these gives the formula $$ \mathbb{E}[X+Y]=\mathbb{E}[X]+\mathbb{E}[Y]. $$
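To make the derivation above concrete, here is a small sketch using the asker's own example: $X$ uniform on $\{1,2,3\}$ and $Y$ uniform on $\{1,2,3,4,5\}$, with the added assumption that the two are independent, so the joint pmf factors as $P(X=x,Y=y)=P(X=x)P(Y=y)$. It computes $\mathbb{E}[X+Y]$ directly from the joint distribution and checks that it matches $\mathbb{E}[X]+\mathbb{E}[Y]$.

```python
from fractions import Fraction
from itertools import product

# The asker's example: X uniform on {1,2,3}, Y uniform on {1,2,3,4,5}.
# Independence is an assumption here; linearity holds either way.
xs = [1, 2, 3]
ys = [1, 2, 3, 4, 5]
px = Fraction(1, len(xs))
py = Fraction(1, len(ys))

# Joint pmf: P(X=x, Y=y) = P(X=x) * P(Y=y) under independence.
joint = {(x, y): px * py for x, y in product(xs, ys)}

# E[f(X,Y)] = sum of f(x,y) * P(X=x, Y=y) over the joint distribution.
E_sum = sum((x + y) * p for (x, y), p in joint.items())
E_X = sum(x * p for (x, y), p in joint.items())  # marginalises out y
E_Y = sum(y * p for (x, y), p in joint.items())  # marginalises out x

print(E_X, E_Y, E_sum)  # 2 3 5
```

Note that the two inner sums in the code mirror the proof exactly: summing $x\,P(X=x,Y=y)$ over all pairs is the same as summing $x\,P(X=x)$ over $x$ alone.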
Solution 2:
Good answers from @nrpeterson and @Lord_Farin.
For a practical demonstration, consider an unbiased random number generator producing an integer $i \in \{1, \dots, 6\}$, more commonly known as a die.
Let $X$ be the result of throwing this die once (an event), and $Y$ be the result of a subsequent, independent throw (another event).
The sample space of each throw is the same:
$$\begin{array}{|c|c|c|c|c|c|c|} \hline \text{Result} & 1 & 2 & 3 & 4 & 5 & 6 \\ \hline \text{Probability} & \frac{1}{6} & \frac{1}{6}& \frac{1}{6}& \frac{1}{6}& \frac{1}{6}& \frac{1}{6} \\ \hline \end{array}$$
As you say, $E[X]=\sum X_ip_i$, so it is easy to show that $E(X)=E(Y)=3.5$. Note that the expected value need not itself be an achievable result; this is not uncommon. (Asking what that means takes you into philosophy rather than mathematics.) Trivially, $E(X)+E(Y)=7$.
So what is $X+Y$? Clearly it is an integer in $[2,12]$, but unlike $X$ and $Y$, not all outcomes are equally likely. Tabulate the value of $X+Y$ for each pair of throws:
$$\begin{array}{c|cccccc} & X=1 & X=2 & X=3 & X=4 & X=5 & X=6 \\ \hline Y=1 & 2 & 3 & 4 & 5 & 6 & 7 \\ Y=2 & 3 & 4 & 5 & 6 & 7 & 8 \\ Y=3 & 4 & 5 & 6 & 7 & 8 & 9 \\ Y=4 & 5 & 6 & 7 & 8 & 9 & 10 \\ Y=5 & 6 & 7 & 8 & 9 & 10 & 11 \\ Y=6 & 7 & 8 & 9 & 10 & 11 & 12 \end{array}$$
There are 36 equally likely pairs giving rise to the 11 possible sums, the most common being 7 with $p=\frac{6}{36}=\frac{1}{6}$ and the least common being 2 and 12, each with $p=\frac{1}{36}$. The expected value $E[X+Y]=\sum_{i,j}(i+j)P(X=i,Y=j)$ works out to 7, so indeed $E[X+Y]=E[X]+E[Y]$.
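The table above can be checked mechanically. A minimal sketch, assuming two fair, independent dice: enumerate all 36 equally likely pairs $(i,j)$, weight each by $\frac{1}{36}$, and compare $E[X+Y]$ against $E[X]+E[Y]$.

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 equally likely outcomes of two fair dice.
faces = range(1, 7)
p = Fraction(1, 36)  # probability of each (i, j) pair

E_X = sum(i * p for i, j in product(faces, faces))        # 7/2 = 3.5
E_Y = sum(j * p for i, j in product(faces, faces))        # 7/2 = 3.5
E_sum = sum((i + j) * p for i, j in product(faces, faces))

print(E_X, E_Y, E_sum)  # 7/2 7/2 7
```

Using exact fractions avoids any floating-point rounding, so the identity $E[X+Y]=E[X]+E[Y]=7$ comes out exactly.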