Explain why $E(X) = \int_0^\infty (1-F_X (t)) \, dt$ for every nonnegative random variable $X$
Let $X$ be a non-negative random variable and $F_X$ the corresponding CDF. Show that $$E(X) = \int_0^\infty (1-F_X(t)) \, dt$$ when $X$ has: a) a discrete distribution, b) a continuous distribution.
I assumed that, for the continuous case, since $F_X(t) = \mathbb{P}(X\leq t)$, we have $1-F_X(t) = 1- \mathbb{P}(X\leq t) = \mathbb{P}(X> t)$. But I have no idea how integrating that quantity helps.
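Integrating that probability turns out to be precisely the point. In the continuous case, assuming $X$ has a density $f$, one can write $\mathbb{P}(X>t)=\int_t^\infty f(x)\,dx$ and swap the order of integration (the integrand is nonnegative, so Tonelli applies): $$\int_0^\infty \mathbb{P}(X>t)\,dt=\int_0^\infty\int_t^\infty f(x)\,dx\,dt=\int_0^\infty\int_0^x dt\,f(x)\,dx=\int_0^\infty x\,f(x)\,dx=E(X).$$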
For every nonnegative random variable $X$, whether discrete or continuous or a mix of these, $$ X=\int_0^X\mathrm dt=\int_0^{+\infty}\mathbf 1_{X\gt t}\,\mathrm dt=\int_0^{+\infty}\mathbf 1_{X\geqslant t}\,\mathrm dt, $$ hence, by applying Tonelli's Theorem,
$$ \mathrm E(X)=\int_0^{+\infty}\mathrm P(X\gt t)\,\mathrm dt=\int_0^{+\infty}\mathrm P(X\geqslant t)\,\mathrm dt. $$
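In particular, when $X$ is integer-valued (a standard instance of case (a)), $\mathrm P(X\gt t)$ is constant and equal to $\mathrm P(X\gt n)$ on each interval $[n,n+1)$, so the integral collapses to the familiar tail-sum formula: $$ \int_0^{+\infty}\mathrm P(X\gt t)\,\mathrm dt=\sum_{n=0}^{\infty}\mathrm P(X\gt n)=\sum_{n=1}^{\infty}\mathrm P(X\geqslant n)=\mathrm E(X). $$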
Likewise, for every $p>0$, $$ X^p=\int_0^Xp\,t^{p-1}\,\mathrm dt=\int_0^{+\infty}\mathbf 1_{X\gt t}\,p\,t^{p-1}\,\mathrm dt=\int_0^{+\infty}\mathbf 1_{X\geqslant t}\,p\,t^{p-1}\,\mathrm dt, $$ hence
$$ \mathrm E(X^p)=\int_0^{+\infty}p\,t^{p-1}\,\mathrm P(X\gt t)\,\mathrm dt=\int_0^{+\infty}p\,t^{p-1}\,\mathrm P(X\geqslant t)\,\mathrm dt. $$
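For a concrete check, take $X$ exponential with unit rate, so that $\mathrm P(X\gt t)=e^{-t}$; the formula then reproduces the usual moments: $$ \mathrm E(X^p)=\int_0^{+\infty}p\,t^{p-1}e^{-t}\,\mathrm dt=p\,\Gamma(p)=\Gamma(p+1), $$ that is, $p!$ when $p$ is a positive integer.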
Adapted from an answer on Cross Validated / stats.stackexchange: geometrically, $$ \mathrm E(X)=\int_0^{+\infty}S(t)\,\mathrm dt, $$ where $S(t)$ is the survival function, equal to $1-F(t)$. [Figure: the area $\int_0^{+\infty}x\,f(x)\,\mathrm dx$ and the area under the survival curve $S$, shaded in two panels.] The two areas are clearly identical.
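The identity is also easy to check numerically. Below is a minimal sketch using SciPy; the exponential distribution with mean $2$ is an arbitrary choice, and any nonnegative distribution would do:

```python
# Numerical check of E(X) = \int_0^\infty S(t) dt for one arbitrary
# nonnegative distribution (exponential with mean 2).
import numpy as np
from scipy import stats
from scipy.integrate import quad

dist = stats.expon(scale=2.0)  # nonnegative, with E(X) = scale = 2.0

# dist.sf is the survival function S(t) = 1 - F(t); integrate it over [0, inf).
area, abserr = quad(dist.sf, 0, np.inf)

print(area)         # ~ 2.0
print(dist.mean())  # 2.0
```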
The function $x\,\mathbf 1[x>0]$ has derivative $\mathbf 1[x>0]$ everywhere except at $x=0$, so by a measurable version of the Fundamental Theorem of Calculus $$ x\,\mathbf 1[x>0]=\int_0^{x}\mathbf 1[t>0]\ dt=\int_0^{\infty}\mathbf 1[x>t]\ dt,\qquad \forall x\in\mathbb R. $$ Applying this identity to a non-negative random variable $X$ yields $$ X=\int_0^{\infty}\mathbf 1[X>t]\ dt,\quad a.s. $$ Taking expectations of both sides and using Fubini–Tonelli to interchange the expectation and the integral, $$ \mathbb EX=\int_0^{\infty}\mathbb P(X>t)\ dt. $$