Expected value of a nonnegative, integer-valued random variable: author's proof?

I was looking back through a rather old textbook I picked up a few years ago, "An Introduction to Probability and its Applications" by Larsen and Marx (published in 1985), and came across the following problem:

Question 5.2.5(a) (page 237). If $X$ is a nonnegative, integer-valued random variable, show that $$E(X) = \sum_{k=1}^\infty P(X \ge k).$$
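
For concreteness, here's a quick numerical check of the identity that I ran for a fair six-sided die (just an illustrative example in Python, not part of the book's problem):

```python
# Check E(X) = sum_{k>=1} P(X >= k) for a fair six-sided die,
# where X takes the values 1..6 each with probability 1/6.
from fractions import Fraction

pmf = {k: Fraction(1, 6) for k in range(1, 7)}

expectation = sum(k * p for k, p in pmf.items())
tail_sum = sum(
    sum(p for j, p in pmf.items() if j >= k)  # P(X >= k)
    for k in range(1, 7)
)

print(expectation, tail_sum)  # both print 7/2
assert expectation == tail_sum
```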

Many versions of this question have been asked here on Math.SE, and the solution I had learned before is an argument similar to the accepted answer to Intuition behind using complementary CDF to compute expectation for nonnegative random variables.

However, the back of the book gives a hint for a proof that seems to take a different approach, and I can't follow where the author is going with it.

The hint given is as follows:

Hint: Add the complements of $P(X \ge k)$ for several values of $k$ and use the fact that $\sum_{k=0}^\infty P(X=k) = 1$.

Maybe I'm so used to the proof I've learned that I'm overlooking something simple, or perhaps there's another way to prove it that I'm just not seeing. I'm also unable to tell which side of the equation the author intends the reader to start from (would we start with the LHS or the RHS when following the hint?).

Any insight or clues into the author's solution (or what we think it is) would be greatly appreciated.


Solution 1:

Perhaps this was the idea the author intended; write

$$ 1-P(X\geq 1)=P(X=0)$$

$$ 1-P(X\geq 2)=P(X=0)+P(X=1)$$

$$\vdots$$

$$ 1-P(X\geq N)=P(X=0)+P(X=1)+\cdots+P(X=N-1)$$

Summing the left-hand sides above and using $\sum_{k=0}^\infty P(X=k)=1$ to rewrite $N$ gives

$$ N-\sum_{k=1}^{N} P(X\geq k)=N[\sum_{k=0}^{N-1} P(X=k)+\sum_{k=N}^{\infty} P(X=k)]-\sum_{k=1}^{N} P(X\geq k),$$

while summing the right-hand sides, in which each $P(X=j)$ appears $N-j$ times (once for each of $k=j+1,\dots,N$), gives

$$(N-0)\cdot P(X=0)+(N-1)\cdot P(X=1)+\cdots+(N-(N-1))\cdot P(X=N-1)$$

$$=N\sum_{k=0}^{N-1}P(X=k)-\sum_{k=0}^{N-1} kP(X=k),$$

so equating the two and rearranging gives

$$ \sum_{k=1}^{N} P(X\geq k)=N\sum_{k=N}^\infty P(X=k)+\sum_{k=0}^{N-1} kP(X=k). \quad [1]$$
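
As a sanity check on the finite-$N$ identity [1], here is a quick numerical verification (a minimal sketch in Python; the geometric pmf $P(X=k)=p(1-p)^k$ is just an illustrative choice, not anything from the book):

```python
# Check [1]: sum_{k=1}^N P(X >= k) = N*P(X >= N) + sum_{k=0}^{N-1} k*P(X = k)
# for the geometric pmf P(X = k) = p*(1-p)^k, k = 0, 1, 2, ...,
# for which P(X >= k) = (1-p)^k.
p = 0.3

def pmf(k):
    return p * (1 - p) ** k      # P(X = k)

def tail(k):
    return (1 - p) ** k          # P(X >= k)

for N in (1, 5, 20, 100):
    lhs = sum(tail(k) for k in range(1, N + 1))
    rhs = N * tail(N) + sum(k * pmf(k) for k in range(N))
    assert abs(lhs - rhs) < 1e-12
    print(N, lhs, rhs)           # the two sides agree for every N
```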

Letting $N \uparrow \infty$ in [1] then gives

$$\sum_{k=1}^{\infty} P(X\geq k)=0+E[X],$$

as desired. Here the first term of [1] vanishes by the squeeze theorem (assuming $E[X]<\infty$), since

$$N\sum_{k=N}^\infty P(X=k)\leq \sum_{k=N}^\infty k P(X=k)= E[X]-\sum_{k=0}^{N-1}k P(X=k)\to 0.$$

(If $E[X]=\infty$, then [1] gives $\sum_{k=1}^{N} P(X\geq k)\geq \sum_{k=0}^{N-1} kP(X=k)\to\infty$, so both sides of the identity are infinite and it holds trivially.)
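
To see the squeeze numerically, here is a small sketch with the same illustrative geometric pmf (this demonstration, like the argument above, assumes $E[X]<\infty$):

```python
# Squeeze: when E[X] < infinity,
#   N*P(X >= N) <= sum_{k>=N} k*P(X = k) = E[X] - sum_{k<N} k*P(X = k) -> 0.
# Illustrated with the geometric pmf P(X = k) = p*(1-p)^k.
p = 0.3
mean = (1 - p) / p                                    # E[X] for this pmf

def pmf(k):
    return p * (1 - p) ** k

for N in (1, 5, 20, 50, 100):
    lower = N * (1 - p) ** N                          # N * P(X >= N)
    upper = mean - sum(k * pmf(k) for k in range(N))  # tail of the mean
    assert lower <= upper + 1e-12
    print(N, lower, upper)       # both columns shrink toward 0
```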