Does $\lim \frac {a_n}{b_n}=1$ imply $\lim \frac {f(a_n)}{f(b_n)}=1$?

I wanted to prove the seemingly simple statement:

If $\lim \frac{a_n}{b_n}=1$ and $f$ is continuous with $f(b_n)\neq 0$ for all $n$, then $\lim \frac{f(a_n)}{f(b_n)}=1.$

I started promptly with

\begin{align} \lim \frac{f(a_n)}{f(b_n)} &= \lim \frac{f\left(b_n \times \frac{a_n}{b_n}\right)}{f(b_n)} \\ &= \frac{f\left(b_n \times \lim \frac{a_n}{b_n}\right)}{f(b_n)} \\ &= \frac{f(b_n \times 1)}{f(b_n)} \\ &= 1 \end{align}

Yet two seconds later I realized what nonsense it was and that I had fallen victim to one of the freshman's dreams.

I would greatly appreciate a hint for a proof or a counterexample if the statement turns out to be false.


Solution 1:

Take $a_n= n+1, b_n =n$ and $f(x)=e^x$ for a counter-example.
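A quick numerical sanity check of this counterexample (an illustration, not a proof; the sampled values of $n$ are arbitrary). Since $e^{a_n}$ overflows for large $n$, the ratio is computed as $e^{a_n - b_n}$:

```python
import math

# Counterexample check: a_n = n + 1, b_n = n, f(x) = e^x.
# a_n/b_n -> 1, but f(a_n)/f(b_n) = e^(a_n - b_n) = e for every n.
for n in (10, 1000, 100000):
    a, b = n + 1, n
    # exp(a)/exp(b) computed as exp(a - b) to avoid overflow
    print(f"n={n}: a_n/b_n = {a/b:.6f}, f(a_n)/f(b_n) = {math.exp(a - b):.6f}")
```

The sequence ratio tends to $1$, while the $f$-ratio is constantly $e \approx 2.718$.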

Solution 2:

This is not true. Let $a_n = n + \log n$, $b_n = n$, $f(x) = e^x$. Then $\lim \frac{a_n}{b_n} = 1$, but $\frac{f(a_n)}{f(b_n)} = e^{\log n} = n$ does not converge to 1. This is an issue with uniform continuity, I think.
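The same kind of sanity check for this counterexample (illustration only), again computing $e^{a_n}/e^{b_n}$ as $e^{a_n - b_n}$ to avoid overflow:

```python
import math

# Counterexample check: a_n = n + log n, b_n = n, f(x) = e^x.
# a_n/b_n -> 1, but f(a_n)/f(b_n) = e^(log n) = n -> infinity.
for n in (10, 1000, 100000):
    a, b = n + math.log(n), n
    print(f"n={n}: a_n/b_n = {a/b:.6f}, f(a_n)/f(b_n) = {math.exp(a - b):.1f}")
```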

Edit: Scratch that, uniform continuity is not sufficient either. But if $f$ is continuous, $a_n, b_n$ are bounded, and $f$ is bounded away from zero on the relevant values, the statement may hold. That is a lot of conditions, though.

Solution 3:

A counterexample with bounded $f$:

$$a_n = n\pi,\quad b_n = \left(n+\frac12\right)\pi, \quad f(x) = \sin x$$
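Numerically (illustration only): the sequence ratio tends to $1$, but since $\sin(n\pi) = 0$ and $\sin\left(\left(n+\frac12\right)\pi\right) = \pm 1$, the $f$-ratio stays at $0$ up to floating-point noise:

```python
import math

# Bounded counterexample check: a_n = n*pi, b_n = (n + 1/2)*pi, f = sin.
for n in (10, 1000, 100000):
    a, b = n * math.pi, (n + 0.5) * math.pi
    # sin(a) is 0 up to rounding error; sin(b) is +/-1
    print(f"n={n}: a_n/b_n = {a/b:.6f}, f(a_n)/f(b_n) = {math.sin(a)/math.sin(b):.2e}")
```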


Of course, as soon as $\lim\limits_{n\to\infty} a_n$ exists, so does $\lim\limits_{n\to\infty} b_n$, whence:

$$\lim_{n\to\infty} \frac{a_n}{b_n} = 1 \iff \lim_{n\to\infty} a_n = \lim_{n\to\infty} b_n$$

and from this last identity, we easily derive (provided $\liminf\limits_{n\to\infty} |f(b_n)| > 0$; thank you Hagen von Eitzen):

$$\lim_{n \to \infty} \frac{f(a_n)}{f(b_n)} = 1$$


Moreover, if $a_n,b_n$ are bounded, by $C$, say, then:

\begin{align} \left|\frac{a_n}{b_n} - 1\right| < \epsilon &\iff \left|a_n - b_n\right| < |b_n|\epsilon \\ &\implies \left|a_n - b_n\right| < C \epsilon \end{align}

Since $f$ is uniformly continuous on $[-C,C]$, by choosing $\epsilon$ small enough we can ensure $|f(a_n)-f(b_n)| < \epsilon_2$. (Here, $C$ could be replaced by anything exceeding the maximum of $\limsup |a_n|$ and $\limsup |b_n|$.)

If additionally $\inf\limits_{n \in \Bbb N}|f(b_n)| > 0$, we obtain $D = \sup\limits_{n\in \Bbb N} \dfrac1{|f(b_n)|} < \infty$, which means:

$$\left|\frac{f(a_n)}{f(b_n)}-1\right| = \frac{|f(a_n)-f(b_n)|}{|f(b_n)|} < \frac{\epsilon_2}{|f(b_n)|} \le \epsilon_2 D$$

Combining all of this, we obtain the desired conclusion:

$$\lim_{n \to \infty} \frac{f(a_n)}{f(b_n)} = 1$$
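To illustrate the positive result numerically, here is a hypothetical choice of data satisfying all the hypotheses: $a_n = 1 + \frac1n$ and $b_n = 1 - \frac1n$ are bounded, and $f(x) = x^2 + 1$ is continuous with $f \ge 1$, so $\inf |f(b_n)| \ge 1$:

```python
# Hypothetical data satisfying the hypotheses above: a_n = 1 + 1/n and
# b_n = 1 - 1/n are bounded, f(x) = x^2 + 1 is continuous with f >= 1,
# so inf |f(b_n)| >= 1. Both ratios then approach 1.
def f(x):
    return x * x + 1

for n in (10, 1000, 100000):
    a, b = 1 + 1 / n, 1 - 1 / n
    print(f"n={n}: a_n/b_n = {a/b:.6f}, f(a_n)/f(b_n) = {f(a)/f(b):.6f}")
```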


NB. As nik points out, if $f$ is continuous and defined on $[-C,C]$, then it will be uniformly continuous due to the Heine-Cantor theorem.