Let $f(x)$ and $g(x)$ be positive nondecreasing functions such that $$ \sum_{n\geq 1} \frac1{f(n)} \quad\text{and}\quad \sum_{n\geq 1} \frac1{g(n)} $$

both diverge.

Must the series $$\sum_{n\geq 1} \frac1{f(n)+g(n)}$$ diverge as well (and why)?


Solution 1:

No, $\sum\frac{1}{f(n)+g(n)}$ need not diverge. We define such a pair $f,g$ inductively. Let $f(1) = g(1) = 2$.

Suppose $f(n)$ and $g(n)$ have been defined for $1\leq n\leq N$. Let $L$ be the largest value taken by $f(n)$ or $g(n)$ for $1\leq n\leq N$. Define $f(N+1) = f(N+2) = \cdots = f(N + L) = L$ and $g(N+i) = 2^iL$ for $1\leq i\leq L$. Then define $g(N+L+1) = \cdots = g(N+L+2^LL) = 2^LL$ and $f(N+L+i) = 2^{L+i}L$ for $1\leq i\leq 2^LL$. This extends the definition of $f(n)$ and $g(n)$ to $1\leq n\leq N+L+2^LL$. Repeat to define $f$ and $g$ on all of $\mathbb{N}$.
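For concreteness, here is the first stage: starting from $N=1$ we have $L=2$, so $f(2)=f(3)=2$, $g(2)=4$, $g(3)=8$; then $g(4)=\cdots=g(11)=8$ and $f(3+i)=2^{2+i}\cdot 2$ for $1\leq i\leq 8$, i.e. $f(4)=16,\ f(5)=32,\ \dots,\ f(11)=2048$. The next stage then starts from $N=11$ with $L=2048$.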

Note that $f$ and $g$ are positive and nondecreasing. In each stage of this process there are $L$ steps at which $f$ takes the value $L$ and $2^LL$ steps at which $g$ takes the value $2^LL$, so $\sum_{n=N+1}^{N+L+2^LL} \frac{1}{f(n)} > 1$ and $\sum_{n=N+1}^{N+L+2^LL} \frac{1}{g(n)} > 1$: each stage adds more than $1$ to each sum of reciprocals. Since the process repeats ad infinitum, $\sum_{n=1}^\infty \frac{1}{f(n)}$ and $\sum_{n=1}^\infty \frac{1}{g(n)}$ both diverge.
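In the first stage above, for instance, the constant blocks alone already contribute $\frac{1}{f(2)}+\frac{1}{f(3)}=\frac12+\frac12=1$ and $\sum_{n=4}^{11}\frac{1}{g(n)}=8\cdot\frac18=1$.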

Note that by construction, $\max(f(n+1),g(n+1)) = 2\max(f(n),g(n))$ for all $n$. Since $f(1)=g(1)=2$, we have $\max(f(n),g(n))=2^n$ for all $n$. Therefore $\frac{1}{f(n)+g(n)}\leq \frac{1}{\max(f(n),g(n))} = 2^{-n}$ for all $n$, so $\sum_{n=1}^\infty\frac{1}{f(n)+g(n)}$ converges by comparison with the geometric series $\sum_n 2^{-n}$.
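As a numeric sanity check, here is a short Python sketch of the construction (the helper name `build_stage` and the 1-indexed-list representation are my own), verifying the doubling of $\max(f,g)$ and the bounds above on the first stage. Later stages are too long to enumerate (the second already appends $2048 + 2^{2048}\cdot 2048$ terms), which reflects how slowly the two reciprocal series diverge.

```python
def build_stage(f, g):
    """Extend the 1-indexed lists f, g by one stage of the construction."""
    L = max(f[1:] + g[1:])        # largest value taken so far
    # f stays constant (= L) for L steps while g doubles at each step ...
    for i in range(1, L + 1):
        f.append(L)
        g.append(2 ** i * L)
    # ... then g stays constant (= 2^L * L) while f doubles at each step.
    M = 2 ** L * L
    for i in range(1, M + 1):
        g.append(M)
        f.append(2 ** (L + i) * L)

f, g = [None, 2], [None, 2]       # index 0 is unused; f(1) = g(1) = 2
build_stage(f, g)                 # first stage: n now runs up to 11
n_max = len(f) - 1

# max(f(n), g(n)) doubles at every step, i.e. equals 2^n:
assert all(max(f[n], g[n]) == 2 ** n for n in range(1, n_max + 1))

print(sum(1 / x for x in f[1:]))  # ~1.62: the stage added more than 1
print(sum(1 / x for x in g[1:]))  # ~1.88: the stage added more than 1
print(sum(1 / (f[n] + g[n]) for n in range(1, n_max + 1)))  # ~0.61 < 1
```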

Solution 2:

This is perhaps an obvious remark, but it is too long for a comment (honestly, I tried). I'll leave it here until the problem is solved.

I was trying to go the same way as Gerry suggested (I hope he meant $\min$ in his answer): $a_n=\frac{1}{f_n}$ and $b_n = \frac{1}{g_n}$ are non-increasing positive sequences such that $$ \sum_na_n = \sum_n b_n = \infty, $$ and the question is whether $$ \sum_n c_n = \sum_n\min\{a_n,b_n\}=\infty. $$ Of course this condition is only sufficient, so a counterexample for this reduced problem is not necessarily a counterexample for the original one.
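(Why it is sufficient: $f_n+g_n\leq 2\max\{f_n,g_n\}$, so $$\frac{1}{f_n+g_n}\geq \frac{1}{2\max\{f_n,g_n\}} = \frac{1}{2}\min\{a_n,b_n\} = \frac{c_n}{2},$$ and divergence of $\sum_n c_n$ would force divergence of the original series by comparison.)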

However, for a counterexample to the original problem (if one exists) we must have $a_n\geq b_n$ infinitely often and $b_n\geq a_n$ infinitely often, or equivalently, $f_n\geq g_n$ and $g_n\geq f_n$ each hold infinitely often. If this does not hold, then the tail of the sum $\sum_n c_n$ consists only of the $a_n$ (or only of the $b_n$) and hence diverges.