Proof that a sum of Bernoulli rvs has Binomial distribution

It's really more straightforward than you're making it out to be. The first method simply asks you to condition on the outcome of the $(n+1)^{\rm st}$ Bernoulli trial. That is to say, let $\{ X_i \}_{i \ge 1}$ be a sequence of IID ${\rm Bernoulli}(p)$ variables, and let $S_k = \sum_{i=1}^k X_i$ be their partial sums. Then $S_{n+1} = S_n + X_{n+1}$, and it is easy to see that if $S_{n+1} = k$, then either $S_n = k$ and $X_{n+1} = 0$, or $S_n = k-1$ and $X_{n+1} = 1$. So apply the law of total probability, conditioning on the outcome of $X_{n+1}$, to get $\Pr[S_{n+1} = k]$; the result then follows by induction on $n$.
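Spelled out, the induction step looks like this (assuming inductively that $S_n \sim {\rm Binomial}(n, p)$):
$$\begin{aligned}
\Pr[S_{n+1} = k] &= \Pr[S_n = k]\,\Pr[X_{n+1} = 0] + \Pr[S_n = k-1]\,\Pr[X_{n+1} = 1] \\
&= \binom{n}{k}p^k(1-p)^{n-k}\,(1-p) + \binom{n}{k-1}p^{k-1}(1-p)^{n-k+1}\,p \\
&= \left[\binom{n}{k} + \binom{n}{k-1}\right]p^k(1-p)^{n+1-k} \\
&= \binom{n+1}{k}p^k(1-p)^{n+1-k},
\end{aligned}$$
where the last equality is Pascal's rule.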

For the second method, use moment generating functions. Recall that the MGF of a ${\rm Bernoulli}(p)$ variable is $$M_X(t) = {\rm E}[e^{tX}] = e^{t \cdot 1} \Pr[X = 1] + e^{t \cdot 0} \Pr[X = 0] = (1-p) + pe^t.$$ Also recall that, by independence, the sum of $n$ IID random variables has MGF equal to the $n{\rm th}$ power of the MGF of an individual variable, so the MGF of the sum is $$M_{S_n}(t) = (M_X(t))^n = (1-p + pe^t)^n.$$ Now compute the binomial MGF directly from its PMF; if it is the same, then since the MGF determines the distribution, you have proven the result.
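As a quick numerical sanity check (a sketch in plain Python, not part of the proof), one can compare $(1-p+pe^t)^n$ against the MGF computed directly from the binomial PMF at a few values of $t$:

```python
import math

def bernoulli_mgf(t, p):
    # MGF of a single Bernoulli(p) variable: (1 - p) + p * e^t
    return (1 - p) + p * math.exp(t)

def binomial_mgf_from_pmf(t, n, p):
    # E[e^{tS}] computed term by term from the Binomial(n, p) PMF
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) * math.exp(t * k)
               for k in range(n + 1))

n, p = 7, 0.3
for t in (-1.0, 0.0, 0.5, 2.0):
    # the nth power of the Bernoulli MGF should match the binomial MGF
    assert math.isclose(bernoulli_mgf(t, p)**n, binomial_mgf_from_pmf(t, n, p))
```

Of course this only checks the identity numerically at a few points; the proof itself is the algebraic expansion of the PMF sum via the binomial theorem.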

Using generating functions: let $P$ be the probability generating function of one Bernoulli trial, i.e. of one $X_i$. Writing $q = 1 - p$, we have $$P(s) = q + ps,$$ since the outcome is $X_i = 0$ with probability $q$, and $X_i = 1$ with probability $p$.

Then, since the trials are independent, the generating function for the sum of $n$ trials is $$P(s)^n = (q + ps)^n = \sum_{r=0}^{n}\binom{n}{r}(ps)^r q^{n-r},$$ in which the coefficient of $s^k$ is $\binom{n}{k}p^k q^{n-k}$. These are exactly the ${\rm Binomial}(n,p)$ probabilities, so $P(s)^n$ is the generating function of a binomial distribution.
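The expansion above can also be checked mechanically: multiplying generating functions corresponds to convolving their coefficient lists. A small sketch in plain Python (the helper `convolve` is ad hoc, not from any library) builds $(q+ps)^n$ one factor at a time and compares each coefficient to the binomial PMF:

```python
import math

def convolve(a, b):
    # Multiply two polynomials given as coefficient lists (index = power of s)
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

n, p = 6, 0.4
q = 1 - p
pgf = [1.0]                        # generating function of the empty sum: P(s) = 1
for _ in range(n):
    pgf = convolve(pgf, [q, p])    # multiply by q + p*s for each trial

# coefficient of s^k should equal C(n, k) p^k q^(n-k)
for k, coeff in enumerate(pgf):
    assert math.isclose(coeff, math.comb(n, k) * p**k * q**(n - k))
```

This mirrors the probabilistic fact that the distribution of a sum of independent variables is the convolution of their distributions.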