How to estimate the growth of a "savage" function near 1?

Solution 1:

If the coefficients of the series are positive and can be described by a nice formula then you can get pretty far by comparing the series to an integral.

For example, if the coefficients $a_n$ are given by a formula $a(n)$ that makes sense for real $n$ and is eventually strictly decreasing, and the series has radius of convergence $1$ with a singularity at $x=1$, then

$$ \sum_{n=0}^{\infty} a_n x^n \sim \int_0^\infty a(n) x^n \,dn $$

as $x \to 1^-$. This can be proved using the idea behind the integral test for convergence.
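Before applying this analytically, here is a quick numerical sanity check of the principle. The coefficient choice $a_n = (n+1)^{-1/2}$, the tolerance, and the cutoff are all just illustrative:

```python
# Sum-vs-integral sanity check for the illustrative choice
# a_n = (n+1)**(-1/2), which is positive and strictly decreasing.
import numpy as np
from scipy.integrate import quad

def series(x, tol=1e-16):
    # Truncate when the geometric tail bound x^N / (1 - x) drops below tol.
    N = int(np.log(tol * (1 - x)) / np.log(x)) + 1
    n = np.arange(N)
    return np.sum(x**n / np.sqrt(n + 1))

for x in [0.9, 0.99, 0.999, 0.9999]:
    upper = 100 / (1 - x)   # the integrand is ~e^{-100} beyond this point
    integral, _ = quad(lambda n: x**n / np.sqrt(n + 1), 0, upper, limit=200)
    s = series(x)
    print(x, s, integral, s / integral)
```

The ratio in the last column tends to $1$ as $x \to 1^-$.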

Applying this to the first two series yields

$$ \sum_{n=0}^{\infty} x^n \sim \int_0^\infty x^n\,dn = -\frac{1}{\log x} \sim \frac{1}{1-x} \tag{1} $$

and

$$ \sum_{n=0}^{\infty} (n+2)x^n \sim \int_0^\infty (n+2)x^n\,dn = \frac{1}{(\log x)^2} - \frac{2}{\log x} \sim \frac{1}{(1-x)^2} \tag{2} $$

as $x \to 1^-$. Note that in both cases we used the fact that

$$ \log x = x-1 + O\left((x-1)^2\right) $$

as $x \to 1$.
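Both of these sums have exact closed forms, so $(1)$ and $(2)$ are easy to check numerically; a minimal sketch:

```python
# Check of (1) and (2): exact sums vs. integrals vs. asymptotic equivalents.
import numpy as np

for x in [0.9, 0.99, 0.999]:
    L = np.log(x)
    s1_exact = 1 / (1 - x)            # sum x^n (geometric series)
    s1_int = -1 / L                   # integral in (1)
    s2_exact = (2 - x) / (1 - x)**2   # sum (n+2) x^n in closed form
    s2_int = 1 / L**2 - 2 / L         # integral in (2)
    print(x, s1_exact / s1_int, s2_exact / s2_int, s2_exact * (1 - x)**2)
```

All three printed ratios tend to $1$ as $x \to 1^-$.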

A similar argument leads to the asymptotic

$$ \sum_{n=1}^{\infty} \frac{1}{n^s} \sim \int_1^\infty \frac{dn}{n^s} = \frac{1}{s-1} \tag{3} $$

as $s \to 1^+$.
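Again purely as a sanity check (using `scipy.special.zeta` for the sum):

```python
# Check of (3): zeta(s) vs. 1/(s-1) as s -> 1^+.
from scipy.special import zeta

for s in [1.1, 1.01, 1.001]:
    print(s, zeta(s), 1 / (s - 1), zeta(s) * (s - 1))
```

The product in the last column tends to $1$; the next-order term is Euler's constant, since $\zeta(s) = \frac{1}{s-1} + \gamma + O(s-1)$.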

Sometimes the resulting integral can't be done in closed form, but we can still obtain an asymptotic after some additional analysis. To address another of your examples, let's study the estimate

$$ \sum_{n=0}^{\infty} x^{b^n} \sim \int_0^\infty x^{b^n}\,dn = \int_0^\infty \exp\Bigl[-b^n (-\log x)\Bigr]\,dn \tag{4} $$

where $b > 1$ is fixed. Making the change of variables $(-\log x) b^n = t$, so that $dn = \frac{dt}{t\log b}$, yields

$$ \int_0^\infty \exp\Bigl[-b^n (-\log x)\Bigr]\,dn = \frac{1}{\log b} \int_{-\log x}^\infty e^{-t}t^{-1}\,dt. \tag{5} $$
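The right-hand side of $(5)$ is exactly $E_1(-\log x)/\log b$, where $E_1$ is the exponential integral, so the substitution can be sanity-checked numerically (`scipy.special.exp1` computes $E_1$; the values $b = 2$, $x = 0.99$ and the cutoff at $n = 60$ are illustrative):

```python
# Numerical check of (5): the n-integral vs. E_1(-log x) / log(b).
import numpy as np
from scipy.integrate import quad
from scipy.special import exp1

b, x = 2.0, 0.99
s = -np.log(x)
# The integrand is numerically zero long before n = 60 (b**60 * s is
# enormous), so truncating the infinite integral there is harmless.
lhs, _ = quad(lambda n: np.exp(-b**n * s), 0, 60)
print(lhs, exp1(s) / np.log(b))   # the two values agree
```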

The integral blows up as $-\log x$ approaches zero. For $t \approx 0$ the integrand is

$$ e^{-t} t^{-1} \approx t^{-1}, $$

so we expect that the integral has a logarithmic singularity here. We'll proceed by pulling out this term from the integral:

$$ \begin{align} &\int_{-\log x}^\infty e^{-t}t^{-1}\,dt \\ &\qquad = \int_{-\log x}^1 e^{-t}t^{-1}\,dt + \int_{1}^\infty e^{-t}t^{-1}\,dt \\ &\qquad = \int_{-\log x}^1 t^{-1}\,dt + \int_{-\log x}^1 \left(e^{-t}-1\right)t^{-1}\,dt + \int_{1}^\infty e^{-t}t^{-1}\,dt \\ &\qquad = -\log(-\log x) + \int_{-\log x}^1 \left(e^{-t}-1\right)t^{-1}\,dt + \int_{1}^\infty e^{-t}t^{-1}\,dt. \end{align} $$

The two integrals in the last expression remain bounded as $-\log x \to 0$ (the integrand $\left(e^{-t}-1\right)t^{-1}$ is bounded near $t = 0$), so the only term that blows up is the logarithm. Thus

$$ \int_{-\log x}^\infty e^{-t}t^{-1}\,dt \sim -\log(-\log x) $$

as $x \to 1^-$. By combining this with $(5)$ we get

$$ \int_0^\infty \exp\Bigl[-b^n (-\log x)\Bigr]\,dn \sim -\log_b(-\log x) $$

and so, returning to the original sum through $(4)$ and once again using the asymptotic $\log x \sim x - 1$, we arrive at the conclusion that

$$ \sum_{n=0}^{\infty} x^{b^n} \sim -\log_b(1-x) \tag{6} $$

as $x \to 1^-$.
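A numerical check of $(6)$, with the illustrative choice $b = 2$. The error hidden in $(6)$ is additive $O(1)$ against a quantity that grows only logarithmically, so the ratio creeps toward $1$ quite slowly:

```python
# Check of (6): sum over n of x**(b**n) vs. -log_b(1 - x), with b = 2.
import numpy as np

def lacunary(x, b=2.0):
    total, n = 0.0, 0
    while True:
        term = np.exp(b**n * np.log(x))   # x**(b**n), computed stably
        if term < 1e-18:                  # later terms are even smaller
            return total
        total += term
        n += 1

for x in [1 - 1e-4, 1 - 1e-8, 1 - 1e-12]:
    target = -np.log(1 - x) / np.log(2)
    print(x, lacunary(x), target, lacunary(x) / target)
```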


What follows has been added in response to the comments below.

The series $\sum_p x^p$, where $p$ ranges over the prime numbers, is trickier to handle. If we write $p_n$ for the $n^\text{th}$ prime, then it is known (as a consequence of the prime number theorem) that

$$ p_n \sim n\log n $$

as $n \to \infty$. If we knew ahead of time that

$$ \sum_{n=1}^{\infty} x^{p_n} \sim \sum_{n=1}^{\infty} x^{n\log n} \tag{7} $$

as $x \to 1^-$, then we could directly obtain an asymptotic equivalent for $\sum_p x^p$ by studying the behavior of the integral $\int_1^\infty x^{n\log n}\,dn$. Unfortunately I don't know how to prove $(7)$ directly; I've actually asked a question about the topic here. We can, however, proceed using the idea presented in an answer to that question.

(Interestingly, the equivalence $(7)$ will be a corollary of our calculations: combine $(8)$ with $\lambda = 1$ and $(10)$.)

First, by comparing the series with the corresponding integral it's possible to show that, for $\lambda > 0$ fixed,

$$ \sum_{n=1}^{\infty} x^{\lambda n \log n} \sim \frac{1}{\lambda(x-1)\log(1-x)} \tag{8} $$

as $x \to 1^-$.
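Here is a numerical sketch of $(8)$ with $\lambda = 1$ (the block size and the $10^{-18}$ cutoff are just practical choices); as with $(6)$, the relative error decays slowly, so the ratios only creep toward $1$:

```python
# Check of (8) with lambda = 1: sum x**(n*log n) vs. 1/((x-1)*log(1-x)).
import numpy as np

def series8(x, lam=1.0, block=10**6):
    total, start = 0.0, 1
    while True:
        n = np.arange(start, start + block, dtype=float)
        terms = np.exp(lam * n * np.log(n) * np.log(x))
        total += terms.sum()
        if terms[-1] < 1e-18:   # exponents only grow, so we may stop
            return total
        start += block

for x in [1 - 1e-3, 1 - 1e-4, 1 - 1e-5]:
    s = series8(x)
    target = 1 / ((x - 1) * np.log(1 - x))
    print(x, s, target, s / target)
```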

Fix $0 < \epsilon < 1$. Since $p_n \sim n\log n$, we may choose $N \in \mathbb N$ such that

$$ \left|\frac{p_n}{n\log n} - 1\right| < \epsilon $$

for all $n \geq N$. For $0 < x < 1$ we have

$$ \sum_{n=N}^{\infty} x^{(1+\epsilon)n\log n} < \sum_{n=N}^{\infty} x^{p_n} < \sum_{n=N}^{\infty} x^{(1-\epsilon)n\log n}. $$

Adding the missing initial terms back to each series, at the cost of finite correction sums, we obtain

$$ \begin{align} &\sum_{n=1}^{\infty} x^{(1+\epsilon)n\log n} + \sum_{n=1}^{N} \left(x^{p_n} - x^{(1+\epsilon)n\log n}\right) \\ &\qquad < \sum_{n=1}^{\infty} x^{p_n} \\ &\qquad < \sum_{n=1}^{\infty} x^{(1-\epsilon)n\log n} + \sum_{n=1}^{N} \left(x^{p_n} - x^{(1-\epsilon)n\log n}\right). \end{align} \tag{9} $$

Note that the two error sums are each bounded independently of $x$:

$$ \left|\sum_{n=1}^{N} \left(x^{p_n} - x^{(1 \pm \epsilon)n\log n}\right)\right| \leq 2N. $$

Now, multiply $(9)$ by $(x-1)\log(1-x)$, which is positive for $0 < x < 1$, so the inequalities are preserved. This factor tends to $0$ as $x \to 1^-$, which kills the bounded correction sums, so taking the limit inferior and limit superior and using $(8)$ yields

$$ \begin{align} \frac{1}{1+\epsilon} &\leq \liminf_{x \to 1^-} (x-1)\log(1-x) \sum_{n=1}^{\infty} x^{p_n} \\ &\leq \limsup _{x \to 1^-} (x-1)\log(1-x) \sum_{n=1}^{\infty} x^{p_n} \\ &\leq \frac{1}{1-\epsilon}. \end{align} $$

This is true for all $0 < \epsilon < 1$, so by allowing $\epsilon \to 0$ we obtain

$$ \lim_{x \to 1^-} (x-1)\log(1-x) \sum_{n=1}^{\infty} x^{p_n} = 1. $$

Thus, changing the notation of the sum back to $\sum_p x^p$,

$$ \sum_p x^p \sim \frac{1}{(x-1)\log(1-x)} \tag{10} $$

as $x \to 1^-$, which is what we wanted to show.
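As a final sanity check of $(10)$, here is a numerical sketch; the sieve helper and the cutoff $50/(1-x)$ (beyond which $x^p$ is numerically negligible) are ad hoc choices:

```python
# Check of (10): sum of x**p over primes p vs. 1/((x-1)*log(1-x)).
import numpy as np

def prime_sum(x, limit):
    sieve = np.ones(limit + 1, dtype=bool)   # sieve of Eratosthenes
    sieve[:2] = False
    for i in range(2, int(limit**0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = False
    primes = np.nonzero(sieve)[0].astype(float)
    return np.exp(primes * np.log(x)).sum()  # sum of x**p, computed stably

for x in [1 - 1e-3, 1 - 1e-4]:
    s = prime_sum(x, limit=int(50 / (1 - x)))
    target = 1 / ((x - 1) * np.log(1 - x))
    print(x, s, target, s / target)
```

Once again the ratio approaches $1$ only slowly, consistent with the logarithmic factor in the equivalent.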