Prove that if $X$ is stochastically larger than $Y$ then $E(X)\ge E(Y)$

Prove that if $X$ is stochastically larger than $Y$ (i.e. $P(X > t) \ge P(Y > t)$ for all $t$), then $E(X)\ge E(Y)$.

I understand how to solve the problem when $X$ and $Y$ are non-negative random variables, because then one can use the fact that $E(X)=\int_{0}^{\infty} P\{X>t\}\,dt$. I do not know how to handle the general case. I found a solution that tells me to prove that $Y^-$ is stochastically larger than $X^-$ and that $X^+$ is stochastically larger than $Y^+$, but I do not know how to prove that $Y^-$ is stochastically larger than $X^-$.
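For reference, the identity I am using in the non-negative case is the standard Tonelli computation (writing $X=\int_0^\infty \mathbf{1}\{X>t\}\,dt$ for $X\ge 0$):

$$E(X)=E\left[\int_0^\infty \mathbf{1}\{X>t\}\,dt\right]=\int_0^\infty E\bigl[\mathbf{1}\{X>t\}\bigr]\,dt=\int_0^\infty P\{X>t\}\,dt,$$

so if $P(X>t)\ge P(Y>t)$ for all $t$, integrating the pointwise inequality gives $E(X)\ge E(Y)$ for non-negative $X$ and $Y$.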

You can write $E(X)=E(X^+ -X^-)$ and, for $Y$, $E(Y)=E(Y^+ -Y^-)$, and then use linearity of the expected value to get the solution. I am just missing this one step in my proof.
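Here $X^+$ and $X^-$ denote the usual positive and negative parts, which is also the notation used in the solution below:

$$X^+=\max\{X,0\},\qquad X^-=\max\{-X,0\},\qquad X=X^+-X^-.$$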


Solution 1:

For the negative parts the inequality reverses: you need that $Y^-$ is stochastically larger than $X^-$, not the other way around. Note that if $P(X> t)\ge P(Y> t)$ for all $t$, then for $t> 0$ you have that $$\begin{align*}P(X^-\ge t)&=P(\max\{-X,0\}\ge t)=1-P(\max\{-X,0\}<t)\\&\overset{t>0}{=}1-P(-X< t)=1-P(X> -t)\\&\le 1-P(Y>-t)=1-P(-Y< t)\\&=P(-Y\ge t)\overset{t>0}{=}P(\max\{-Y,0\}\ge t)=P(Y^-\ge t)\end{align*}$$ and for $t\le0$ you have that $$P(X^-\ge t)=1=P(Y^-\ge t)$$ since $X^-$ and $Y^-$ are non-negative. Combining these two, $X^-$ is stochastically smaller than $Y^-$ (the comparison with $\ge t$ for every $t$ also gives the one with $> t$, since $P(X^->t)=\lim_{s\downarrow t}P(X^-\ge s)$). For the positive parts the direct comparison works: for $t\ge 0$, $P(X^+>t)=P(X>t)\ge P(Y>t)=P(Y^+>t)$, and for $t<0$ both probabilities equal $1$, so $X^+$ is stochastically larger than $Y^+$. But that is exactly what you want, since $$E[X]=E[X^+-X^-]=E[X^+]-E[X^-]\ge E[Y^+]-E[Y^-]=E[Y^+-Y^-]=E[Y],$$ where the $\ge$ is justified by your proof for non-negative random variables, since $X^+,X^-, Y^+, Y^-$ are non-negative by definition: it gives $E[X^+]\ge E[Y^+]$ and $E[X^-]\le E[Y^-]$.
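If it helps to see the inequalities concretely, here is a quick Monte Carlo sanity check (not part of the proof; the two normal distributions are just an illustrative choice, since shifting a distribution to the right gives stochastic dominance):

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 1_000_000

# Illustrative choice: X has the distribution of Y + 1,
# so P(X > t) >= P(Y > t) for every t.
x = rng.normal(loc=1.0, scale=1.0, size=n)
y = rng.normal(loc=0.0, scale=1.0, size=n)

# Positive and negative parts, as in the proof above.
x_pos, x_neg = np.maximum(x, 0), np.maximum(-x, 0)
y_pos, y_neg = np.maximum(y, 0), np.maximum(-y, 0)

# Empirical tail probabilities on a grid of thresholds:
# X^+ should dominate Y^+, while Y^- should dominate X^-.
for t in np.linspace(0.0, 3.0, 7):
    assert (x_pos > t).mean() >= (y_pos > t).mean() - 1e-3  # small Monte Carlo slack
    assert (y_neg > t).mean() >= (x_neg > t).mean() - 1e-3

# And the expectations line up: E[X^+] >= E[Y^+], E[X^-] <= E[Y^-], E[X] >= E[Y].
print(x_pos.mean() >= y_pos.mean(), x_neg.mean() <= y_neg.mean(), x.mean() >= y.mean())
```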