Good upper bound for $\sum\limits_{i=1}^{k}{n \choose i}$?

Solution 1:

We are interested in the sum $$V(n,k) := \sum\limits_{i=0}^k \binom{n}{i}.$$ (Notice that I have added the $i=0$ term for convenience; this changes the sum only by an additive $1$.) Indeed, this quantity has a combinatorial significance: it is the volume of a "Hamming ball" of radius $k$ in the Hamming space (the $n$-dimensional hypercube $\{0,1\}^n$). It is quite difficult to say anything precise without information about the relative sizes of $k$ and $n$, so I will address some common regimes of interest in the answer. Without loss of generality, we may take $k \leq n/2$: for $k > n/2$, we can write $V(n,k) = 2^n - V(n, n-k-1)$, since $\sum_{i=k+1}^{n}\binom{n}{i} = \sum_{j=0}^{n-k-1}\binom{n}{j}$ by the symmetry $\binom{n}{i} = \binom{n}{n-i}$.
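As a quick sanity check (not part of the original argument), here is a short Python snippet that computes $V(n,k)$ exactly and verifies the complement identity $V(n,k) = 2^n - V(n, n-k-1)$ for a small $n$:

```python
from math import comb

def V(n, k):
    """Volume of a Hamming ball of radius k in {0,1}^n: sum of C(n, i) for i = 0..k."""
    return sum(comb(n, i) for i in range(k + 1))

# Verify V(n, k) = 2^n - V(n, n-k-1) for all k in 0..n-1
n = 20
for k in range(n):
    assert V(n, k) == 2**n - V(n, n - k - 1)
```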

First, suppose that $n$ grows while $k$ is a fixed constant independent of $n$. Then the lower order terms (terms corresponding to $i < k$) are all $O(n^{k-1})$. Since there are just $k$ of them, we can absorb all of them into a single $O(n^{k-1})$. The remaining term $\binom{n}{k}$ is a polynomial in $n$ of degree $k$ and leading coefficient $\frac{1}{k!}$. Therefore, this term is $\frac{n^{k}}{k!}+O(n^{k-1})$. Combining these two observations, $$ V(n,k) = \frac{n^{k}}{k!} + O(n^{k-1}). $$
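To illustrate the fixed-$k$ asymptotic numerically (a sketch, not part of the proof), the following snippet checks that the ratio $V(n,k) / (n^k/k!)$ approaches $1$ as $n$ grows:

```python
from math import comb, factorial

def V(n, k):
    """Sum of C(n, i) for i = 0..k."""
    return sum(comb(n, i) for i in range(k + 1))

k = 3
for n in [10, 100, 1000, 10000]:
    ratio = V(n, k) / (n**k / factorial(k))
    print(n, ratio)  # approaches 1 as n grows, consistent with V(n,k) = n^k/k! + O(n^{k-1})
```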


Another regime of interest is when $k$ is large. Specifically, suppose $k = \alpha n$ where $\alpha \in (0, 1/2)$. In this case, Stirling's approximation and a fair amount of calculation show that $$ V(n, \alpha n) = 2^{n H(\alpha) - \frac{1}{2} \log_2 n + O(1)}, $$ where $H(\alpha)$ is the binary (Shannon) entropy, given by $$ H(\alpha) = - \alpha \log_2 \alpha - (1-\alpha) \log_2 (1-\alpha). $$ For $\alpha \in (0, 1/2)$, $H(\alpha)$ is strictly increasing and strictly positive. I want to point out that when $k$ is this large, the $n^k$ upper bound is quite crummy: $n^k = 2^{k \log_2 n} = 2^{\alpha n \log_2 n}$. This is a useless upper bound because we already know the much better upper bound $2^n$. (For a reference for this estimate of $V(n, \alpha n)$, check Problem 9.42 in Concrete Mathematics. In many cases, however, one is content with a less precise form, such as $2^{n H(\alpha)+o(n)}$.)
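One can check this estimate numerically as well (again just a sanity check under the stated asymptotics): the gap $\log_2 V(n, \alpha n) - \big(n H(\alpha) - \tfrac{1}{2}\log_2 n\big)$ should stay bounded as $n$ grows, since it is exactly the $O(1)$ term.

```python
from math import comb, log2

def V(n, k):
    """Sum of C(n, i) for i = 0..k."""
    return sum(comb(n, i) for i in range(k + 1))

def H(a):
    """Binary entropy in bits."""
    return -a * log2(a) - (1 - a) * log2(1 - a)

alpha = 0.3
for n in [100, 1000, 5000]:           # chosen so that alpha*n is an integer
    k = int(alpha * n)
    excess = log2(V(n, k)) - (n * H(alpha) - 0.5 * log2(n))
    print(n, excess)                  # stays bounded: this is the O(1) term
```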

There are, of course, more things that one can ask: generalization to the $q$-ary hypercube, intermediate values of $k$ (e.g., $k = O(\log n)$), values of $k$ close to $n/2$ (e.g., $\frac{n}{2} - c \sqrt{n}$) and so on. Each of these is a fruitful direction of study, but I cannot touch upon any of them here.

EDIT: The precise estimate is taken from Mike's comment. (Thanks, Mike!)

Solution 2:

Let $k$ be fixed. Then, in the "big Oh" sense, the bound $O(n^k)$ is tight.

For example, let $\epsilon$ be a very small positive number. Since $\binom{n}{k} \sim n^k/k!$ for fixed $k$, it is easy to verify that $$\lim_{n \to \infty} \frac{n^{k-\epsilon}}{\binom{n}{k}}=0.$$ So we cannot replace $n^{k}$ by $n^{k-\epsilon}$. We cannot even replace $n^k$ by something that grows almost as fast as $n^k$, such as $n^k/\log(\log n)$.
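A quick numerical illustration (just a sketch, with arbitrary choices $k=3$, $\epsilon=1/2$): the ratio $n^{k-\epsilon}/\binom{n}{k}$ indeed shrinks toward $0$, roughly like $k!/n^{\epsilon}$.

```python
from math import comb

k, eps = 3, 0.5
for n in [10, 100, 1000, 10000]:
    ratio = n**(k - eps) / comb(n, k)
    print(n, ratio)  # decreases toward 0, roughly like k!/sqrt(n)
```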

So for fixed $k$, we will not find a "cheaper" upper bound in the "big Oh" sense. Of course, we can improve the implicit constant, by dividing by $k!$ (the terms $\binom{n}{i}$ with $i<k$ make negligible relative contribution when $n$ is large).
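The claim that the terms with $i < k$ are negligible can also be checked directly (a small sanity check, not part of the argument): their combined relative contribution to the sum tends to $0$.

```python
from math import comb

def V(n, k):
    """Sum of C(n, i) for i = 0..k."""
    return sum(comb(n, i) for i in range(k + 1))

k = 4
for n in [10, 100, 1000]:
    rel = (V(n, k) - comb(n, k)) / comb(n, k)
    print(n, rel)  # relative contribution of the terms with i < k; tends to 0
```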