Claim to prove:

Suppose $f'(x) \geq 0$ for any $x$. Further, suppose that if $f'(x)=0$, then $x$ is an isolated zero of $f'$. Given these two conditions, $f$ must be a strictly increasing function.

We will make use of the following lemma:

If $f: \mathbb R \to \mathbb R$ is locally strictly increasing at every $x \in \mathbb R$, then $f$ is a strictly increasing function. $\quad (\dagger)$
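Here I take "locally strictly increasing at $x$" in the pointwise sense (this is the notion the two cases below actually establish):

$$\exists\,\delta \gt 0:\quad f(u) \lt f(x)\ \text{ for all } u\in(x-\delta,x) \quad\text{and}\quad f(x) \lt f(v)\ \text{ for all } v\in(x,x+\delta),$$

and $(\dagger)$ holds with this reading of the hypothesis.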


Let $a$ be any point where $f'(a) \gt 0$, i.e. $\displaystyle \lim_{x \to a}\frac{f(x)-f(a)}{x-a} \gt 0$. This implies that there is some punctured neighborhood $(a-\delta_a,a+\delta_a)\setminus \{a\}$ such that for any $x$ in this punctured neighborhood: $\frac{f(x)-f(a)}{x-a} \gt 0 \quad (*)$. Suppose that $x$ is to the left of $a$, so that $x-a \lt 0$. In order to satisfy $(*)$, we need $f(x)-f(a) \lt 0$, i.e. $f(x) \lt f(a)$. Similarly, if $x$ is to the right of $a$, then $f(x) \gt f(a)$. Therefore $f$ is locally strictly increasing at $a$ in the pointwise sense above.
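To make the punctured neighborhood explicit, apply the definition of the limit with $\varepsilon = f'(a) \gt 0$: there is a $\delta_a \gt 0$ such that for all $x$ with $0 \lt |x-a| \lt \delta_a$,

$$\left|\frac{f(x)-f(a)}{x-a}-f'(a)\right| \lt f'(a) \quad\Longrightarrow\quad \frac{f(x)-f(a)}{x-a} \gt 0.$$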

Next, suppose $a$ is any point where $f'(a)=0$. Because $a$ is an isolated zero of $f'$, there is some punctured neighborhood $(a-\delta_a,a+\delta_a)\setminus \{a\}$ such that $f'(x) \neq 0$ for any $x$ in this punctured neighborhood. If $f'(x)\neq 0$, then, by assumption, $f'(x)\gt 0$. Therefore, by the Mean Value Theorem, $f$ is strictly increasing on each of the intervals $(a-\delta_a,a)$ and $(a,a+\delta_a)$. Continuity at $a$ allows us to stitch these two intervals together: $f$ must be strictly increasing on all of $(a-\delta_a,a+\delta_a)$.
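Spelling out those two steps: if $x \lt y$ both lie in $(a-\delta_a,a)$ or both lie in $(a,a+\delta_a)$, the Mean Value Theorem gives some $c$ between them with

$$f(y)-f(x)=f'(c)\,(y-x) \gt 0,$$

and for the stitch, if $a-\delta_a \lt x \lt t \lt a$ then $f(x) \lt f(t) \le \displaystyle\lim_{s \to a^-} f(s) = f(a)$, while if $a \lt t \lt y \lt a+\delta_a$ then $f(a) = \displaystyle\lim_{s \to a^+} f(s) \le f(t) \lt f(y)$.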

The above two paragraphs demonstrate that $f$ is locally strictly increasing at every $x \in \mathbb R$ (in the first case in the pointwise sense, in the second case on a full neighborhood, which is stronger). Applying $(\dagger)$, we find that $f$ is a strictly increasing function. $\quad \square$
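As a sanity check, a concrete function satisfying both hypotheses is

$$f(x)=x-\sin x,\qquad f'(x)=1-\cos x \ge 0,\qquad f'(x)=0 \iff x\in 2\pi\mathbb Z,$$

whose zeros of $f'$ are isolated, and $f$ is indeed strictly increasing.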


Any alternative approaches would be appreciated!


Solution 1:

Here's an alternative. Since $f'(x)\ge 0$, we know that $f$ is (weakly) increasing. Aiming for a contradiction, suppose $f$ is not strictly increasing. This means there are numbers $a\lt b$ such that $f(a)\ge f(b)$. But $f(a)\le f(b)$ since $f$ is increasing. Hence $f(a)=f(b)$, and since $f$ is monotonic, $f(x)=f(a)=f(b)$ for all $x\in[a,b]$. So we have found a non-trivial interval on which $f$ is constant; its derivative there is $0$, so the zeros of $f'$ in $(a,b)$ are not isolated. Contradiction.
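For completeness, the opening assertion (that $f'\ge 0$ forces $f$ to be weakly increasing) is again the Mean Value Theorem: for any $x \lt y$ there is some $c\in(x,y)$ with

$$f(y)-f(x)=f'(c)\,(y-x) \ge 0.$$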