If $\mathrm P(X=k)=\binom nkp^k(1-p)^{n-k}$ for a binomial distribution, then from the definition of the expected value $$\mathrm E(X) = \sum^n_{k=0}k\mathrm P(X=k)=\sum^n_{k=0}k\binom nkp^k(1-p)^{n-k}$$ but the expected value of a binomial distribution is $np$, so how is

$$\sum^n_{k=0}k\binom nkp^k(1-p)^{n-k}=np\,?$$


The main idea is to factor out $np$. Since the $k=0$ term vanishes, we can rewrite:

$$\sum^n_{k=0}k\binom nkp^k(1-p)^{n-k}= \sum^n_{k=1} k\binom nkp^k(1-p)^{n-k}$$

Using $k\binom nk = \dfrac{n!}{(n-k)!\,(k-1)!}$ (the $k$ cancels against the $k!$) and then factoring out $np$, this gives:

$$\sum^n_{k=1} k\binom nkp^k(1-p)^{n-k} = np \sum^n_{k=1} \dfrac{(n-1)!}{(n-k)!(k-1)!}p^{k-1}(1-p)^{n-k}$$

Notice that the RHS is:

$$np \sum^n_{k=1} \dfrac{(n-1)!}{(n-k)!(k-1)!}p^{k-1}(1-p)^{n-k} = np \sum^n_{k=1} \binom {n-1}{k-1}p^{k-1}(1-p)^{n-k},$$

and since, reindexing with $j = k-1$ and applying the binomial theorem, $\displaystyle \sum^n_{k=1} \binom {n-1}{k-1}p^{k-1}(1-p)^{n-k} = \sum^{n-1}_{j=0} \binom {n-1}{j}p^{j}(1-p)^{n-1-j} = (p + (1-p))^{n-1} = 1$, we therefore indeed have

$$\sum^n_{k=0}k\binom nkp^k(1-p)^{n-k} = np.$$
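If you want to sanity-check the identity numerically, here is a minimal sketch in Python (the function name and the $(n,p)$ pairs are arbitrary choices for illustration):

```python
from math import comb

def binomial_mean(n, p):
    # Evaluate sum_{k=0}^{n} k * C(n,k) * p^k * (1-p)^(n-k) directly.
    return sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))

# The sum should agree with n*p up to floating-point error.
for n, p in [(5, 0.3), (10, 0.5), (20, 0.75)]:
    assert abs(binomial_mean(n, p) - n * p) < 1e-10
```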


Let $B_i=1$ if we have a success on the $i$-th trial, and $0$ otherwise. Then the number $X$ of successes is $B_1+B_2+\cdots +B_n$. But then by the linearity of expectation, we have $$E(X)=E(B_1+B_2+\cdots+B_n)=E(B_1)+E(B_2)+\cdots +E(B_n).$$ It is easy to verify that $E(B_i)=1\cdot p+0\cdot(1-p)=p$, so $E(X)=np$.
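The indicator argument can be illustrated with a small Monte Carlo sketch: simulate $X$ as a sum of $n$ independent Bernoulli indicators and check that the sample mean is close to $np$. (The parameter values, trial count, and seed are arbitrary choices, not from the answer above.)

```python
import random

def simulate_mean(n, p, trials, seed=0):
    # X = B_1 + ... + B_n, where each B_i is 1 with probability p.
    # Return the average of X over many simulated repetitions.
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        total += sum(1 for _ in range(n) if rng.random() < p)
    return total / trials

estimate = simulate_mean(n=10, p=0.3, trials=100_000)
# Linearity of expectation predicts E(X) = 10 * 0.3 = 3.0,
# and the estimate should land within a few hundredths of that.
```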

Your sum is just another expression for the same mean, so the argument above shows that the combinatorial identity in your problem is correct. You can think of it as a mean proof of a combinatorial identity.

Remark: A very similar argument to the one above can be used to compute the variance of the binomial.

The linearity of expectation holds even when the random variables are not independent. Suppose we take a sample of size $n$, without replacement, from a box that has $N$ objects, of which $G$ are good. The same argument shows that the expected number of good objects in the sample is $n\dfrac{G}{N}$. This is somewhat unpleasant to prove using combinatorial manipulation.