Intuition about infinite sums

I am particularly concerned about these two sums: $$\sum_{x=0}^\infty \frac{1}{2^x} = 1 + \frac{1}{2} + \frac{1}{4} + \cdots = 2$$

and

$$\sum_{x=1}^\infty \frac{1}{x} = 1 + \frac{1}{2} + \frac{1}{3} + \cdots = \infty $$

Now, I know and understand the proofs that the first sum converges to two and the second one diverges. (You probably shouldn't write an equals sign with a divergent series like that.)

But on an intuitive level, what separates these two sums in such a way, that their limits are so drastically different?

Both sums eventually add very small numbers (e.g. $\frac{1}{10000000000}$), yet one always stays smaller than two and the other goes far beyond that. You would think (again, on an intuitive level; the math is clear) that the terms become so small that the partial sums settle on a certain number and stop changing (significantly).

For example:

$$\sum_{x=1}^{100} \frac{1}{x} \approx 5.19 $$

$$\sum_{x=1}^{1000000} \frac{1}{x} \approx 14.39 $$

$$\sum_{x=1}^{10^{31}} \frac{1}{x} \approx 71.96 $$

As one can see, the growth slows down drastically; at some point the increments should become vanishingly small, just as with the first sum (the fractions with powers of two).
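(For reference: these values can be checked with a short Python sketch; the last sum is far too long to add term by term, so it is estimated with the standard asymptotic $H_n \approx \ln n + \gamma$.)

```python
import math

def harmonic(n):
    """Partial sum H_n = 1 + 1/2 + ... + 1/n, summed directly."""
    return sum(1.0 / x for x in range(1, n + 1))

print(harmonic(100))     # ~5.19
print(harmonic(10**6))   # ~14.39

# Summing 10**31 terms directly is infeasible; use the standard
# asymptotic H_n ~ ln(n) + gamma (Euler-Mascheroni constant).
gamma = 0.5772156649015329
print(math.log(10**31) + gamma)  # ~71.96
```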


This is related to how fast the terms decrease.

A geometric series (your $1/2^x$) is one in which every term is a constant fraction of the previous one, so that dividing the whole sum by this constant is the same as dropping the first term:

$$\frac12\left(1+\frac12+\frac14+\frac18+\cdots\right)=\frac12+\frac14+\frac18+\cdots$$ So you can write

$$\frac12S=S-1$$ and deduce $S=2$.
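(A quick numerical check of this self-similarity, as a minimal Python sketch.)

```python
# Partial sums of 1 + 1/2 + 1/4 + ... approach 2, and halving the
# partial sum is (up to the truncated tail) the same as dropping the 1.
s = sum(0.5 ** x for x in range(60))
print(s)        # ~2.0 (within float rounding)
print(0.5 * s)  # ~ s - 1, as in the identity S/2 = S - 1
```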

The same reasoning applies to all geometric series

$$\sum_{k=0}^\infty r^k$$

provided that $r<1$. Indeed, if $r=1$ or $r>1$, the sum clearly grows forever. (This simplified discussion ignores the case $r<0$.)

This leads to a simple convergence criterion: if the ratio of successive terms is a constant less than $1$, the series converges. More generally, if this ratio is variable but tends to a limit smaller than $1$, the series converges.

Conversely, if the ratio tends to a limit larger than $1$, the series diverges. But if the ratio tends to $1$, we cannot tell: the criterion is inconclusive.
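(A small numerical sketch of this criterion; the helper names are just for illustration.)

```python
def ratio(a, n):
    """Ratio a(n+1)/a(n) of successive terms of a series."""
    return a(n + 1) / a(n)

geometric = lambda n: 0.5 ** n      # r = 1/2
harmonic  = lambda n: 1.0 / n       # p = 1, diverges
inv_sq    = lambda n: 1.0 / n ** 2  # p = 2, converges

for n in (10, 100, 1000):
    print(n, ratio(geometric, n), ratio(harmonic, n), ratio(inv_sq, n))
# The geometric ratio is always 0.5 (< 1: convergence).
# Both 1/n and 1/n^2 have ratios tending to 1: the criterion cannot
# separate them, even though one diverges and the other converges.
```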

The harmonic series ($1/n$) and the generalized harmonic series ($1/n^p$) fall precisely into this category, since

$$\lim_{n\to\infty}\left(\frac{n}{n+1}\right)^p=1.$$

To deal with them, a trick is to sum the terms in groups of doubling size, so that each group's sum exceeds a constant. More precisely,

$$\begin{gather} 1,\\ \frac12,\\ \frac13+\frac14 > \frac14+\frac14 = \frac12,\\ \frac15+\frac16+\frac17+\frac18 > \frac18+\frac18+\frac18+\frac18 = \frac12,\\ \cdots \end{gather}$$

Though the groups get longer and longer, each contributes at least $\frac12$, so you can continue forever and the sum grows to infinity.
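(Numerically, the doubling blocks look like this; a minimal sketch.)

```python
# Sum 1/n over the doubling blocks (2**(k-1), 2**k]; every block
# contains 2**(k-1) terms, each at least 2**-k, so each block sum
# is at least 1/2 (in fact it tends to ln 2 ~ 0.693).
for k in range(1, 15):
    block = sum(1.0 / n for n in range(2 ** (k - 1) + 1, 2 ** k + 1))
    print(k, round(block, 4))
```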

If you repeat the reasoning with exponent $p$,

$$\begin{gather} 1,\\ \frac1{2^p},\\ \frac1{3^p}+\frac1{4^p} > \frac1{4^p}+\frac1{4^p} = \frac2{4^p}=\frac1{2^{2p-1}},\\ \frac1{5^p}+\frac1{6^p}+\frac1{7^p}+\frac1{8^p} > \frac1{8^p}+\frac1{8^p}+\frac1{8^p}+\frac1{8^p} = \frac4{8^p} = \frac1{2^{3p-2}},\\ \cdots \end{gather}$$

In this new series of lower bounds, the ratio of successive terms tends to $2^{1-p}$, and by the first criterion you can conclude convergence for $p>1$ and divergence for $p<1$. (A complete discussion must involve a similar upper bound, omitted here.)
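(The same block sums for general $p$, as a sketch; the successive ratios make the geometric behavior visible.)

```python
def block_sum(p, k):
    """Sum of 1/n**p over the k-th doubling block (2**(k-1), 2**k]."""
    return sum(1.0 / n ** p for n in range(2 ** (k - 1) + 1, 2 ** k + 1))

for p in (2.0, 1.0, 0.5):
    blocks = [block_sum(p, k) for k in range(1, 12)]
    ratios = [b / a for a, b in zip(blocks, blocks[1:])]
    print(p, [round(r, 3) for r in ratios])
# p = 2.0: ratios -> 0.5   = 2**(1-p)  (< 1, converges)
# p = 1.0: ratios -> 1.0               (the borderline case)
# p = 0.5: ratios -> 1.414 = 2**(1-p)  (> 1, diverges)
```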


To summarize, in decreasing order of rate of decrease:

$$\begin{gather} \sum r^n,\ r<1 \text{ converges}\\ \sum \frac1{n^p},\ p>1 \text{ converges}\\ \sum \frac1{n^p},\ p=1 \text{ diverges}\\ \sum \frac1{n^p},\ p<1 \text{ diverges}\\ \sum r^n,\ r=1 \text{ diverges}\\ \sum r^n,\ r>1 \text{ diverges} \end{gather}$$

For other series, you can compare to these decrease rates. For example, with the general term $1/n!$, the limit of the ratio is $\lim_{n\to\infty}n!/(n+1)!=0$ and the series converges, faster than any geometric series. And $1/\sqrt[3]{n^2+1}$ gives a divergent series because the general term behaves like $1/n^{2/3}$, with $p=2/3<1$.
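(Both comparisons are easy to see numerically; a sketch.)

```python
import math

# 1/n!: the ratio of successive terms is n!/(n+1)! = 1/(n+1) -> 0,
# so the terms decay faster than any geometric series.
for n in (1, 5, 20):
    print(n, math.factorial(n) / math.factorial(n + 1))

# 1/(n^2+1)^(1/3) behaves like 1/n^(2/3): the ratio of the two tends
# to 1, and since p = 2/3 < 1, the series diverges by comparison.
for n in (10, 1000, 10**6):
    print(n, (1.0 / (n ** 2 + 1) ** (1 / 3)) / n ** (-2 / 3))
```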

The curves below show the trend of the terms of these sequences on a logarithmic scale. The green one corresponds to the harmonic series, which is the border between convergent and divergent series.

*(figure: terms of several series on a logarithmic scale; the green curve is the harmonic series)*


To help your intuition about why it does not suffice that the individual terms get arbitrarily small, consider the following series: $$\begin{aligned} a &= \sum_{k=1}^{\infty} 2^{-\lfloor \log_2 k\rfloor}\\ &= 1 + \underbrace{\frac12 + \frac12}_{=1} + \underbrace{\frac14+\frac14+\frac14+\frac14}_{=1} + \underbrace{\frac18+\frac18+\frac18+\frac18+\frac18+\frac18+\frac18+\frac18}_{=1} + \ldots \end{aligned}$$ Clearly the individual terms of this series get arbitrarily small, but there are sufficiently many of them that the sum still adds up to an arbitrarily large number; indeed, for any positive integer $n$, the first $2^n-1$ terms add up to $n$.
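(That last claim is easy to verify; a sketch, using the identity $\lfloor\log_2 k\rfloor = $ `k.bit_length() - 1` for positive integers.)

```python
def term(k):
    """k-th term 2**(-floor(log2 k)); bit_length gives floor(log2 k) + 1."""
    return 2.0 ** -(k.bit_length() - 1)

for n in range(1, 8):
    s = sum(term(k) for k in range(1, 2 ** n))  # first 2**n - 1 terms
    print(n, s)  # prints exactly n: 1.0, 2.0, 3.0, ...
```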

So for a series to converge, it does not suffice that the terms get arbitrarily small; they have to get small fast enough that their growing count does not outweigh their diminishing size.


The difference is not in how small the terms are getting (as the OP says, in both cases they become arbitrarily small) but in how many terms there are of a particular size. For instance, say two terms have the same size if their first nonzero binary digits are in the same place. In the first sum, you have one term of size $1$, one of size $1/2$, one of size $1/4$, one of size $1/8$, etc. In the second sum, you have one term of size $1$, one of size $1/2$, but then two of size $1/4$ ($1/3$ and $1/4$), four of size $1/8$ ($1/5$ through $1/8$), and so on. Each time the size is halved, the number of terms of that size is doubled; so the contribution from the terms of each size never decreases, and the sum marches steadily off to infinity.
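(A sketch making this size-counting concrete for the harmonic series.)

```python
from collections import defaultdict

# Bucket the terms 1/n by "size": 1/n has size 2**-j exactly when
# 2**(j-1) < n <= 2**j, i.e. j = ceil(log2 n) = (n - 1).bit_length().
count = defaultdict(int)
total = defaultdict(float)
for n in range(1, 2 ** 10 + 1):
    j = (n - 1).bit_length()
    count[j] += 1
    total[j] += 1.0 / n

for j in sorted(count):
    print(f"size 2^-{j}: {count[j]:5d} terms, contributing {total[j]:.4f}")
# Halving the size doubles the count, so every size class contributes
# at least 1/2, and the partial sums march steadily off to infinity.
```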