Entropy of a binomial distribution
Solution 1:
This answer follows roughly the suggestion of @MichaelLugo in the comments.
We are interested in the sum $$H = -\sum_{k=0}^n {n\choose k}p^k(1-p)^{n-k} \log_2\left[{n\choose k}p^k(1-p)^{n-k} \right].$$

For $n$ large we can use the de Moivre-Laplace theorem, which approximates the binomial pmf by a Gaussian density: $$H \simeq -\int_{-\infty}^\infty dx \, \frac{1}{\sqrt{2\pi}\sigma} \exp\left[-\frac{(x-\mu)^2}{2\sigma^2}\right] \log_2\left\{\frac{1}{\sqrt{2\pi}\sigma} \exp\left[-\frac{(x-\mu)^2}{2\sigma^2}\right] \right\},$$ where $\mu = n p$ and $\sigma^2 = n p(1-p)$.

Expanding the logarithm and using the fact that the Gaussian integrates to $1$ and has second central moment $\sigma^2$, $$\begin{eqnarray*} H &\simeq& \int_{-\infty}^\infty dx \, \frac{1}{\sqrt{2\pi}\sigma} \exp\left[-\frac{(x-\mu)^2}{2\sigma^2}\right] \left[\log_2(\sqrt{2\pi}\sigma) + \frac{(x-\mu)^2}{2\sigma^2} \log_2 e \right] \\ &=& \log_2(\sqrt{2\pi}\sigma) + \frac{\sigma^2}{2\sigma^2} \log_2 e \\ &=& \frac{1}{2} \log_2 (2\pi e\sigma^2) \end{eqnarray*}$$ and so $$H \simeq \frac{1}{2} \log_2 \left[2\pi e n p(1-p)\right].$$

Higher-order terms can be found, essentially by deriving a more careful (and less simple) version of de Moivre-Laplace.
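As a sanity check, the approximation can be compared against the exact sum numerically. The sketch below (standard-library Python only; the values $n = 1000$, $p = 0.3$ are arbitrary example choices) evaluates the binomial entropy via log-probabilities, using `lgamma` to avoid overflow in $\binom{n}{k}$ and underflow in $p^k(1-p)^{n-k}$ for large $n$:

```python
import math

# Example parameters (arbitrary choices for the check)
n, p = 1000, 0.3

def log_pmf(k):
    """Natural log of the binomial pmf at k, computed via lgamma
    so that n-choose-k never overflows and p^k never underflows."""
    log_comb = (math.lgamma(n + 1) - math.lgamma(k + 1)
                - math.lgamma(n - k + 1))
    return log_comb + k * math.log(p) + (n - k) * math.log(1 - p)

# Exact entropy in bits: H = -sum p_k log2 p_k
H_exact = 0.0
for k in range(n + 1):
    lp = log_pmf(k)
    H_exact -= math.exp(lp) * lp / math.log(2)  # convert nats -> bits

# Gaussian approximation derived above: (1/2) log2(2 pi e n p (1-p))
H_approx = 0.5 * math.log2(2 * math.pi * math.e * n * p * (1 - p))

print(H_exact, H_approx)
```

For $n$ this large the two values agree to well within $0.01$ bits, consistent with the correction terms being $O(1/n)$.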