If $f'(c) \ge 0$ for all $c \in (a,b)$, then $f$ is increasing on $[a,b]$: proof without the Mean Value Theorem

Let $f: [a,b] \to \mathbb R$ be a function differentiable on $(a,b)$. It is known that if $f'(c) \ge 0$ for all $c \in (a,b)$, then $f$ is increasing on $[a,b]$, and this can be proved by Lagrange's Mean Value Theorem. I would like to know: is there any other proof of this?


Solution 1:

Assume that there are points $c$, $d$ with $a<c<d<b$ such that $${f(d)-f(c)\over d-c}=-p<0\ .$$ I claim that there is a point $\xi\in[c,d]$ with $f'(\xi)\leq-p$.

Proof. Using binary division we can find an increasing sequence $(c_n)_{n\geq0}$ and a decreasing sequence $(d_n)_{n\geq0}$ with $$c\leq c_n<d_n\leq d,\qquad d_n-c_n={d-c\over2^n}\qquad(n\geq0)\ ,$$ such that $${f(d_n)-f(c_n)\over d_n-c_n}\leq -p\qquad(n\geq0)\ .\tag{1}$$ Indeed, the difference quotient over $[c_n,d_n]$ is the average of the difference quotients over its two halves, so at least one half again satisfies $(1)$. The $c_n$ and the $d_n$ have a common limit $\xi\in[c,d]$. If $\xi=c_n$ for all $n\geq n_0$ (or, similarly, $\xi=d_n$ for all $n\geq n_0$), one immediately concludes from $(1)$ that $f'(\xi)\leq-p$. Otherwise we may assume $c_n<\xi<d_n$ for all $n\geq0$. Rewriting the left-hand side of $(1)$ we can then say that $${\xi-c_n\over d_n-c_n}{f(\xi)-f(c_n)\over \xi-c_n}+{d_n-\xi\over d_n-c_n}{f(d_n)-f(\xi)\over d_n-\xi}\leq -p\qquad(n\geq0)\ .\tag{2}$$ Now let $\epsilon>0$ be given. For sufficiently large $n$ both difference quotients of $f$ in $(2)$ are $\geq f'(\xi)-\epsilon$, and since the left-hand side of $(2)$ is a convex combination of them, this implies $$f'(\xi)-\epsilon\leq -p\ .$$ Since this is true for all $\epsilon>0$, the claim follows.
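
The binary-division step above is essentially a bisection algorithm, so a small numerical sketch may help make it concrete. This is only an illustration under assumptions of my own: the test function $f(x)=\sin 3x$, the interval $[0.4,\,1.2]$ and the helper names are not part of the answer. At each step we keep a half whose secant slope is still $\leq -p$; the nested intervals shrink toward a point $\xi$ with $f'(\xi)\leq -p$.

```python
import math

def secant_slope(f, u, v):
    """Difference quotient (f(v) - f(u)) / (v - u)."""
    return (f(v) - f(u)) / (v - u)

def bisect_bad_slope(f, c, d, p, steps=30):
    """Binary division as in the proof: assuming secant_slope(f, c, d) <= -p,
    repeatedly keep a half-interval whose secant slope is still <= -p."""
    assert secant_slope(f, c, d) <= -p
    for _ in range(steps):
        m = (c + d) / 2
        # The slope over [c, d] is the average of the slopes over the two
        # halves, so at least one half again has slope <= -p.
        if secant_slope(f, c, m) <= -p:
            d = m
        else:
            c = m
    return (c + d) / 2

# Hypothetical example: f(x) = sin(3x), whose secant slope over [0.4, 1.2]
# is negative; call it -p.
f = lambda x: math.sin(3 * x)
p = -secant_slope(f, 0.4, 1.2)
xi = bisect_bad_slope(f, 0.4, 1.2, p)
print(xi, 3 * math.cos(3 * xi))   # f'(xi) = 3 cos(3 xi), approximately <= -p
```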

Solution 2:

To anyone reading this answer: it is somewhat wrong. Please check comments.

If I may assume $f'(c)>0$:

For every $x \in [a,b]$ there is an interval $I_x$ on which, by the definition of the derivative as a limit, $f$ is increasing *. Then $\bigcup_{x \in [a,b]} I_x \supseteq [a,b]$, and since $[a,b]$ is compact we can find a finite subcover $I_{x_1},\dots,I_{x_n}$ of $[a,b]$. Now $f$ is increasing on each interval $I_{x_i}$, and for any two points $p<q$ with $p,q \in [a,b]$ we can find intermediate points $r_1=p<r_2<\dots<r_m=q$ such that for every $j$ the points $r_j$ and $r_{j+1}$ both lie in $I_{x_i}$ for some $i$ (a sketch of this chaining step follows the footnote below). Then $f(r_j)<f(r_{j+1})$ for every $j$, and hence $f(p)<f(q)$.

*Suppose $f'(x)=c>0$. Then there exists $\delta>0$ such that $\frac{f(y)-f(x)}{y-x}>\frac{c}{2}>0$ whenever $0<|y-x|<\delta$. Define $I_x$ to be the interval $(x-\frac{\delta}{2},x+\frac{\delta}{2})$. Then $f$ is strictly increasing on $I_x$.
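
The chaining step of the covering argument can also be made concrete. The following is only a sketch under my own assumptions (the cover and the stepping rule are hypothetical, not taken from the answer): given a finite open cover of $[p,q]$, it produces points $r_1=p<\dots<r_m=q$ such that consecutive points always lie in a common interval of the cover.

```python
def chain_through_cover(p, q, cover):
    """Given p < q and a finite list of open intervals (l, r) that covers
    [p, q], return points r_1 = p < ... < r_m = q such that each pair of
    consecutive points lies in a common interval of the cover."""
    points, current = [p], p
    while current < q:
        # Farthest right endpoint among intervals containing the current point
        # (assumes the intervals really do cover [p, q]).
        reach = max(r for (l, r) in cover if l < current < r)
        nxt = q if reach > q else (current + reach) / 2  # stay inside that interval
        points.append(nxt)
        current = nxt
    return points

# Hypothetical cover of [0, 1] for illustration.
print(chain_through_cover(0.0, 1.0, [(-0.1, 0.3), (0.2, 0.6), (0.5, 0.9), (0.8, 1.1)]))
```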

Method 2, with $f'(c) \geq 0$. Every assumption used here can be proven without use of the MVT.

Simply note that $f(x)=f(a)+\int_a^x f'(t)\,\mathrm{d}t$. For $y>x$, $f(y)-f(x)=\int_x^y f'(t)\,\mathrm{d}t$. Since $f'(t) \geq 0$ for $t \in [x,y]$, we have $f(y)-f(x) \geq 0$.
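
A quick numerical sanity check of this (with an example function of my own choosing, not from the answer): take $f(t)=t+\sin t$, so $f'(t)=1+\cos t\geq 0$, and compare $f(y)-f(x)$ with a Riemann-sum approximation of $\int_x^y f'(t)\,\mathrm{d}t$.

```python
import math

def integral(g, x, y, n=100_000):
    """Midpoint Riemann-sum approximation of the integral of g over [x, y]."""
    h = (y - x) / n
    return h * sum(g(x + (k + 0.5) * h) for k in range(n))

# Hypothetical example: f(t) = t + sin(t) has f'(t) = 1 + cos(t) >= 0 everywhere.
f  = lambda t: t + math.sin(t)
fp = lambda t: 1 + math.cos(t)

x, y = 0.5, 2.5
print(f(y) - f(x))          # direct difference
print(integral(fp, x, y))   # integral of f' over [x, y]: same value, and >= 0
```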