How slowly can a series grow to be convergent?
This may be a poorly worded question, but I hope to flesh out my ideas well. The takeaway is this: some series diverge to infinity while others converge to a fixed value. Take two classical examples, the harmonic series and the Basel problem:
$$\displaystyle\sum_{n=1}^\infty \frac{1}{n}= \infty \, , \hspace{0.6cm} \displaystyle\sum_{n=1}^\infty\frac{1}{n^2} = \frac{\pi^2}{6}$$
On one end we have a divergent series, and on the other a convergent one, both of which seem eerily similar except for the square in the latter. The question I would raise is: at what "rate of growth" (loosely speaking) does a series have to grow to tip over from convergence into sudden, chaotic divergence?
Yes, in calculus one learns various convergence tests for deciding whether a given series converges, but I am wondering whether there is a well-known "rate of growth" that a series must "exceed" in order to indisputably diverge.
Solution 1:
It turns out that there is no perfect dividing line: given any convergent series (of positive terms), one can find another convergent series whose terms go to zero more slowly, while given any divergent series (of positive terms), one can find another divergent series whose terms are smaller still.
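To make the convergent half of this claim concrete, here is a standard construction (essentially, I believe, the Abel–Dini theorem). Given a convergent series $\sum a_n$ of positive terms, let $r_n = \sum_{k=n}^\infty a_k$ denote its tails, so that $a_n = r_n - r_{n+1}$ and $r_n \to 0$. Then the new series telescopes,
$$\sum_{n=1}^\infty \left(\sqrt{r_n} - \sqrt{r_{n+1}}\right) = \sqrt{r_1} < \infty,$$
so it still converges, yet its terms are eventually much larger relative to the original ones:
$$\frac{\sqrt{r_n} - \sqrt{r_{n+1}}}{a_n} = \frac{\sqrt{r_n} - \sqrt{r_{n+1}}}{r_n - r_{n+1}} = \frac{1}{\sqrt{r_n} + \sqrt{r_{n+1}}} \longrightarrow \infty.$$
A symmetric construction using partial sums $s_n$ in place of tails (the series $\sum a_n/s_n$ diverges whenever $\sum a_n$ does) handles the divergent half.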
However, for practical purposes, it can be helpful to note that the series $$ \sum_{n=2}^\infty \frac1{n\log n},\; \sum_{n=3}^\infty \frac1{n\log n\log\log n},\; \sum_{n=16}^\infty \frac1{n\log n\log\log n\log\log\log n},\; \dots $$ all diverge, while the series $$ \sum_{n=2}^\infty \frac1{n(\log n)^{1+\varepsilon}},\; \sum_{n=3}^\infty \frac1{n\log n(\log\log n)^{1+\varepsilon}},\; \sum_{n=16}^\infty \frac1{n\log n\log\log n(\log\log\log n)^{1+\varepsilon}},\; \dots $$ all converge for every $\varepsilon>0$. (You can verify all of these assertions using the Integral Test!)
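For instance, for the first series in each family (the deeper iterated-logarithm cases work the same way, one substitution at a time), the substitution $u = \log x$ reduces the Integral Test to elementary antiderivatives:
$$\int_2^\infty \frac{dx}{x\log x} = \Big[\log\log x\Big]_2^\infty = \infty, \qquad \int_2^\infty \frac{dx}{x(\log x)^{1+\varepsilon}} = \Big[-\frac{1}{\varepsilon(\log x)^{\varepsilon}}\Big]_2^\infty = \frac{1}{\varepsilon(\log 2)^{\varepsilon}} < \infty.$$
So the first integral diverges (and hence so does the series), while the second converges for every $\varepsilon>0$.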