A nonnegative random variable has zero expectation if and only if it is zero almost surely

Let $Y$ be a non-negative random variable. Prove that $E(Y) = 0$ if and only if $P(Y=0)=1$.

I can prove this for discrete $Y$, but my understanding is that the result holds for general $Y$.


Solution 1:

If $Y$ is a non-negative random variable defined on a probability space $\Omega$ and

$$E(Y) = \int_{\Omega} Y dP=0$$

then $Y = 0$ almost surely, i.e. $P(\{\omega \in \Omega: Y(\omega)=0\})=1$.

Proof: For any $m \in \mathbf{N}$, let

$$E_m = \{\omega \in \Omega:Y(\omega) > 1/m\}$$

then, since $Y$ is non-negative and $Y > 1/m$ on $E_m$, we have $Y = Y1_{\Omega} \geq Y 1_{E_m} \geq \frac{1}{m}1_{E_m}$, and hence

$$0 = \int_{\Omega} Y dP \geq \int_{E_m} Y dP\geq \frac1{m}P(E_m) \geq 0,$$

so $P(E_m) = 0$ for every $m \in \mathbf{N}$.
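The step above is a Markov-type bound: $P(Y > 1/m) \leq m\,E(Y)$. As a quick numerical sanity check (a sketch only, with a hypothetical discrete $Y$; this is illustration, not part of the proof):

```python
# Hypothetical non-negative discrete Y: P(Y=0) = 0.99, P(Y=0.5) = 0.01.
values = [0.0, 0.5]
probs = [0.99, 0.01]

# Exact expectation: E[Y] = 0.5 * 0.01 = 0.005.
EY = sum(v * p for v, p in zip(values, probs))

# Check the Markov-type bound P(Y > 1/m) <= m * E[Y] for several m.
for m in (1, 2, 3, 10):
    p_exceed = sum(p for v, p in zip(values, probs) if v > 1 / m)
    assert p_exceed <= m * EY + 1e-12
    print(f"m={m}: P(Y > 1/m) = {p_exceed:.3f} <= m*E[Y] = {m * EY:.3f}")
```

In particular, if $E(Y)$ were $0$, the bound would force $P(Y > 1/m) = 0$ for every $m$, which is exactly how the proof proceeds.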

Since the sets $E_m$ increase to $\{Y \neq 0\}$, continuity of measure from below gives

$$0 \leq P(\{\omega \in \Omega:Y(\omega) \neq 0\})= P\left(\bigcup_{m=1}^{\infty} E_m\right) = \lim_{m \rightarrow \infty}P(E_m)=0.$$

Hence

$$P(\{\omega \in \Omega:Y(\omega) \neq 0\})=0 \implies P(\{\omega \in \Omega:Y(\omega) = 0\})=1 $$ QED

Conversely, if $Y=0$ a.s., then $Y$ vanishes outside a null set, so $E(Y) = \int_{\Omega} Y\, dP = 0$.

Solution 2:

Sufficiency is obvious. For necessity, suppose toward a contradiction that $E[Y] = 0$ but $P(Y > 0) = c > 0$. By the law of total expectation (the $\{Y = 0\}$ term contributes nothing):

$$ 0 = E[Y] \ge P(Y>0)\,E[Y\mid Y >0] = c\,E[Y \mid Y>0] > 0, $$

where the last inequality holds because $Y$ is strictly positive on the conditioning event, which has positive probability. This gives $0 > 0$, a contradiction.
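The total-expectation decomposition used here can be checked numerically for a simple discrete $Y$ (hypothetical values, chosen only for illustration):

```python
# Hypothetical discrete non-negative Y: P(Y=0)=0.7, P(Y=2)=0.2, P(Y=5)=0.1.
values = [0.0, 2.0, 5.0]
probs = [0.7, 0.2, 0.1]

EY = sum(v * p for v, p in zip(values, probs))        # E[Y] = 0.9

c = sum(p for v, p in zip(values, probs) if v > 0)    # P(Y > 0) = 0.3
# Conditional expectation E[Y | Y > 0]: restrict to {Y > 0} and renormalize.
E_cond = sum(v * p for v, p in zip(values, probs) if v > 0) / c

# Law of total expectation: E[Y] = P(Y>0)*E[Y|Y>0] + P(Y=0)*0.
assert abs(EY - c * E_cond) < 1e-12
print(f"E[Y] = {EY}, P(Y>0)*E[Y|Y>0] = {c * E_cond}")
```

This is why the argument works: if $E[Y] = 0$ while $c > 0$, the product $c\,E[Y \mid Y>0]$ would have to vanish, forcing $E[Y \mid Y>0] = 0$ even though $Y > 0$ on that event.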