Prove there exists $a\in \Bbb{R}$ such that $f'(a)=0$.

Let $f$ be differentiable on $\Bbb{R}$ and let $\lim\limits_{x\to \infty}f(x)=\lim\limits_{x\to -\infty}f(x)=0$. Prove there exists $a\in \Bbb{R}$ such that $f'(a)=0$.

Attempt: If $f$ is constant we are done. Otherwise, suppose there is no such $a$. Then, in particular, $f$ has no extremum, so $f$ has no point where it goes from increasing to decreasing or from decreasing to increasing. $f$ is continuous since it is differentiable. Hence, since $\lim\limits_{x\to \infty}f(x)=\lim\limits_{x\to -\infty}f(x)=0$, $f$ must be constant, a contradiction. Therefore either $f$ is constant and $f'(x)=0$ everywhere, or $f$ has an extremum at some $x=a$ and $f'(a)=0$.

I feel like my attempt is not formal enough, any correction? I would appreciate your help.


Solution 1:

Define the function $g$ on $\left(-\frac{\pi}2,\frac{\pi}2\right)$ by $g(x)=f(\tan x)$. By the hypothesis we can extend $g$ by continuity to a function $\tilde g$ continuous on $\left[-\frac{\pi}2,\frac{\pi}2\right]$ and differentiable on $\left(-\frac{\pi}2,\frac{\pi}2\right)$, with $\tilde g(-\pi/2)=\tilde g(\pi/2)=0$. So by Rolle's theorem there is $\alpha\in \left(-\frac{\pi}2,\frac{\pi}2\right)$ such that

$$g'(\alpha)=f'(\tan \alpha)\underbrace{(1+\tan^2(\alpha))}_{\ne0}=0\implies f'(\tan\alpha)=0$$ so take $a=\tan\alpha$ to conclude.
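The substitution can be traced numerically as a sanity check (a sketch, not part of the proof: the sample function $f(x)=xe^{-x^2}$ and the bracketing interval below are illustrative assumptions).

```python
import math

def f(x):
    # sample function (an assumption for illustration): f(x) = x * exp(-x^2),
    # which vanishes as x -> +infinity and x -> -infinity
    return x * math.exp(-x * x)

def g(x):
    # the substituted function on (-pi/2, pi/2); it extends continuously
    # to [-pi/2, pi/2] with g(+/-pi/2) = 0
    return f(math.tan(x))

def dg(x, h=1e-6):
    # central-difference approximation of g'(x)
    return (g(x + h) - g(x - h)) / (2 * h)

# Rolle's theorem guarantees some alpha with g'(alpha) = 0; locate one by
# bisection on dg over a bracket where dg changes sign (the bracket
# [0.1, 1.5] is an illustrative choice for this f)
lo, hi = 0.1, 1.5
for _ in range(100):
    mid = (lo + hi) / 2
    if dg(lo) * dg(mid) <= 0:
        hi = mid
    else:
        lo = mid
alpha = (lo + hi) / 2
a = math.tan(alpha)  # for this f, f'(x) = e^{-x^2}(1 - 2x^2) vanishes at 1/sqrt(2)
```

Here $a=\tan\alpha$ recovers the critical point of $f$ itself, matching the chain-rule identity in the display above.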

Solution 2:

Assume $f$ is non-constant, and pick a point $a$ with $f(a)=b\neq 0$. Without loss of generality, assume $b>0$: since ${d\over dx}(-f(x))=-{d\over dx}f(x)$, the function $f$ has a zero derivative at a point if and only if $-f$ does. Now select $0<\epsilon<b$. By continuity of $f$ and the intermediate value theorem, $\exists x_1>a$ such that $f(x_1)=\epsilon$, since by definition

$$\lim_{x\to\infty}f(x)=0\iff \text{for every }\epsilon >0 \text{ there is an }N\text{ such that if }x>N,\; |f(x)|<\epsilon,$$

and since $f$ takes a value less than $\epsilon$ (for large enough $x$) and a value greater than $\epsilon$ (namely $b$, at $a$), it must take on the value $\epsilon$ somewhere in between.

Similarly there is $x_2<a$ so that $f(x_2)=\epsilon$.

Then by the mean value theorem, there is a $c$ with $x_2 < c < x_1$ such that

$$f'(c)={f(x_1)-f(x_2)\over x_1-x_2}={\epsilon -\epsilon\over x_1-x_2}=0.$$
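The same steps can be checked numerically (a sketch under assumptions: the sample $f(x)=x/(1+x^2)$ and the search brackets are illustrative choices, not part of the argument).

```python
import math

def f(x):
    # sample function (an assumption for illustration): f(x) = x / (1 + x^2),
    # which tends to 0 as x -> +infinity and x -> -infinity
    return x / (1 + x * x)

def bisect(g, lo, hi):
    # locate a root of g on [lo, hi], assuming g changes sign there
    flo = g(lo)
    for _ in range(200):
        mid = (lo + hi) / 2
        if flo * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
            flo = g(lo)
    return (lo + hi) / 2

a, b = 1.0, f(1.0)   # f(a) = b = 1/2 > 0
eps = b / 2          # any 0 < eps < b works

# x2 < a with f(x2) = eps, and x1 > a with f(x1) = eps,
# found via the intermediate value theorem argument above
x2 = bisect(lambda x: f(x) - eps, 0.0, a)
x1 = bisect(lambda x: f(x) - eps, a, 100.0)

# the secant slope over [x2, x1] is 0, so the mean value theorem gives
# f'(c) = 0 for some c in (x2, x1); here c = 1, since
# f'(x) = (1 - x^2) / (1 + x^2)^2
secant = (f(x1) - f(x2)) / (x1 - x2)
```

For this $f$ the two solutions of $f(x)=\epsilon$ are $x=2\pm\sqrt3$, bracketing the critical point $c=1$.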