Proof verification: $f$ is increasing at $x_0$.

Q. Let $f:(a,b) \rightarrow \mathbb R$ be differentiable and let $x_0 \in (a,b)$. Determine the relationship between the following statements.

(a) $f^\prime (x_0) \gt 0$

(b) $f$ is increasing at $x_0$

(c) $f$ is an increasing function on some interval that contains $x_0$.

My proof

(a) $\rightarrow$ (b)

$\lim_{h \to 0} {{f(x_0 +h) - f(x_0)} \over h} = \lim_{h \to 0+} {{f(x_0 +h) - f(x_0)} \over h} = \lim_{h \to 0-} {{f(x_0 +h) - f(x_0)} \over h} = \alpha \gt 0$

Therefore, there exists $\delta \gt 0$ s.t. $0 \lt h \lt \delta \Rightarrow f(x_0+h) - f(x_0) \gt 0$ and $-\delta \lt h \lt 0 \Rightarrow f(x_0+h) - f(x_0) \lt 0$. So $f$ is increasing at $x_0$.
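
(As a quick numerical sanity check of this step, and not part of the proof: for an arbitrary test function such as $f(x) = x^3 + x$, which has $f'(0) = 1 > 0$, the one-sided difference quotients at $x_0 = 0$ indeed stay positive for small $h$.)

```python
# Sanity check (not a proof): f(x) = x**3 + x has f'(0) = 1 > 0,
# so the difference quotients at x0 = 0 should be positive for small h.
def f(x):
    return x**3 + x

x0 = 0.0
for n in range(1, 8):
    h = 10.0 ** (-n)
    right = (f(x0 + h) - f(x0)) / h      # ~ 1 > 0, so f(x0 + h) > f(x0)
    left = (f(x0 - h) - f(x0)) / (-h)    # ~ 1 > 0, so f(x0 - h) < f(x0)
    print(h, right, left)
```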

(b) $\rightarrow$ (c)

Using the $\delta$ from above, it follows that $f$ is an increasing function on $N(x_0, \delta)$.

(c) $\rightarrow$ (a)

If $f$ is an increasing function on $N(x_0, \delta)$, then $\lim_{h \to 0+} {{f(x_0 +h) - f(x_0)} \over h} = \lim_{h \to 0-} {{f(x_0 +h) - f(x_0)} \over h} \gt 0$.

Therefore $f^\prime (x_0) \gt 0$.

So (a), (b), and (c) are equivalent.

I heard that (b) $\rightarrow$ (c) can be false for some functions, but since $f$ is differentiable, I think it doesn't matter here. Did I prove it correctly? Do I have to add anything more?


The only valid relationship is $(a) \Rightarrow (b)$ (apart from the trivial implication $(c) \Rightarrow (b)$ pointed out by user "edm" via comment). There exist counterexamples showing that the other implications do not hold. Also note the meanings of "increasing at a point" and "increasing on an interval":

A function $f$ defined on a neighborhood $I$ of $c$ is said to be increasing at the point $c$ if there is a neighborhood $J \subseteq I$ of $c$ such that $x\in J,\ x<c\Rightarrow f(x) \leq f(c)$ and $x\in J,\ x>c\Rightarrow f(x) \geq f(c)$.

A function $f$ defined on an interval $I$ is said to be increasing on the interval $I$ if $x, y\in I,\ x<y\Rightarrow f(x) \leq f(y)$.

We have similar definitions for strictly increasing, with the weak inequalities above replaced by strict ones. Note that, by these definitions, $f$ being increasing at $c$ does not necessarily imply that $f$ is increasing on some neighborhood of $c$.
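
To make the distinction concrete, here is a rough numerical sketch in Python; the helper names `increasing_at_point` and `increasing_on_interval` are ad hoc, and checking finitely many sampled points can only suggest, never prove, these properties.

```python
# Rough numerical sketch of the two definitions above; sampling finitely
# many points can only suggest, never prove, the properties.
def increasing_at_point(f, c, radius=1e-3, samples=1000):
    """f(x) <= f(c) for sampled x < c and f(x) >= f(c) for sampled x > c."""
    step = radius / samples
    xs = [c + k * step for k in range(-samples, samples + 1) if k != 0]
    return all(f(x) <= f(c) if x < c else f(x) >= f(c) for x in xs)

def increasing_on_interval(f, a, b, samples=1000):
    """f(x) <= f(y) for consecutive sampled points x < y in [a, b]."""
    ys = [f(a + (b - a) * k / samples) for k in range(samples + 1)]
    return all(ys[i] <= ys[i + 1] for i in range(samples))
```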

To understand the difference between these concepts clearly, we use a very famous example from Hardy's masterpiece A Course of Pure Mathematics: $$f(x) = x^{2}\sin(1/x) + kx \quad (x \neq 0), \qquad f(0) = 0,$$ where $k$ is some fixed number in the interval $(0,1)$. We have $$f'(x) = 2x\sin(1/x) - \cos(1/x) + k \quad (x \neq 0), \qquad f'(0) = k > 0.$$ Since $f'(0)$ is positive, it can easily be proved (using the definition of the derivative as a limit) that $f$ is (strictly) increasing at $0$. But as $x\to 0$ the derivative $f'(x)$ oscillates between values near $k-1<0$ and $k+1>0$, so every neighborhood of $0$ contains points at which the derivative is negative, and hence $f$ cannot be increasing on any such neighborhood. This shows that $(b)$ of your question does not necessarily imply $(c)$.
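
Here is a small numerical illustration of this example (a sketch only, with $k = 1/2$ chosen arbitrarily from $(0,1)$): the derivative is roughly $k-1<0$ at the points $x_n = 1/(2\pi n)$, which lie in every neighborhood of $0$, while the difference quotient at $0$ stays close to $k>0$.

```python
import math

k = 0.5  # any fixed k in (0, 1)

def f(x):
    if x == 0:
        return 0.0
    return x * x * math.sin(1.0 / x) + k * x

def fprime(x):
    if x == 0:
        return k          # f'(0) = k, obtained from the limit definition
    return 2 * x * math.sin(1.0 / x) - math.cos(1.0 / x) + k

# At x_n = 1/(2*pi*n): sin(1/x_n) = 0 and cos(1/x_n) = 1, so f'(x_n) = k - 1 < 0,
# and such points exist in every neighborhood of 0.
for n in (1, 10, 100, 1000):
    x_n = 1.0 / (2 * math.pi * n)
    print(f"f'({x_n:.1e}) = {fprime(x_n):+.4f}")   # approximately k - 1 = -0.5

# Yet (f(h) - f(0))/h = h*sin(1/h) + k lies in [k - |h|, k + |h|], so it is
# positive for small h: f is (strictly) increasing at 0.
for h in (1e-2, 1e-4, -1e-2, -1e-4):
    print(f"difference quotient at h = {h:+.0e}: {(f(h) - f(0)) / h:+.4f}")
```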

Next consider the function $f(x) = x^{3}$. Then $f$ is increasing at $0$ as well as increasing on every interval, and yet $f'(0) = 0$, so neither the implication $(b) \Rightarrow (a)$ nor the implication $(c) \Rightarrow (a)$ holds in general.
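
Again only as a sketch (writing $g$ for this particular function to avoid clashing with the $f$ above), a grid check on $[-1,1]$ together with the difference quotients at $0$:

```python
# Sketch: g(x) = x**3 is increasing on [-1, 1] (and on every interval),
# yet its derivative at 0 vanishes.
def g(x):
    return x ** 3

xs = [i / 1000.0 for i in range(-1000, 1001)]                 # grid on [-1, 1]
ys = [g(x) for x in xs]
print(all(ys[i] <= ys[i + 1] for i in range(len(ys) - 1)))    # True on the grid

for h in (1e-1, 1e-3, 1e-5):
    print((g(h) - g(0)) / h)   # equals h**2 -> 0, consistent with g'(0) = 0
```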


You should be able to figure out why your arguments for the implications $(b) \Rightarrow (c)$ and $(c) \Rightarrow (a)$ are wrong.


It is interesting to note the following relation between $(b)$ and $(c)$ (mentioned in the links given by user Dave L. Renfro via comment, and also given as a solved exercise in Hardy's book mentioned earlier):

If $f$ is increasing at every point of an interval $I$, then $f$ is increasing on the interval $I$.

The proof of the above result depends on the completeness of the real numbers, and although the result may seem obvious, the proof is not.