$f'(x) = [f(x)]^{2}$. Prove $f(x) = 0$

Let $f:\mathbb{R} \rightarrow \mathbb{R}$ be a differentiable function such that $f(0) = 0$ and $f'(x) = [f(x)]^{2}$, $\forall x \in \mathbb{R}$. Show that $f(x) = 0$, $\forall x \in \mathbb{R} $.

I first (unsuccessfully) tried using the Mean Value Theorem, but this problem is in the Integrals chapter, so the solution probably involves them. I can't really see how to make integrals work here, though.

What I've got so far:

(i) Since $f$ is differentiable, it is continuous and hence integrable on any bounded interval. Therefore $f^2$ is also integrable, and since $f'=f^2$, so is $f'$.

(ii) $f'(x) = [f(x)]^{2} \geq 0$ for all $x \in \mathbb{R}$, so $f$ is non-decreasing.
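(iii) If it helps, since $f' = f^2$ is continuous, the Fundamental Theorem of Calculus should give

$$f(x) = f(x) - f(0) = \int_{0}^{x} f'(t)\,dt = \int_{0}^{x} [f(t)]^{2}\,dt \qquad \forall x \in \mathbb{R},$$

though I don't see how to conclude from this.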


Solution 1:

Just to give an MVT proof, let's assume $f(b)\not=0$ for some $b$. Without loss of generality*, we can assume $0\lt b\lt1$ and $0\lt f(c)\lt f(b)\lt 1$ for all $0\lt c\lt b$. By MVT, there is a $c\in(0,b)$ such that

$$f'(c)={f(b)-f(0)\over b-0}={f(b)\over b}$$

But our various hypotheses now tell us

$$f'(c)=(f(c))^2\lt f(c)\lt f(b)\lt{f(b)\over b}$$

which is a contradiction. So there is no $b$ for which $f(b)\not=0$.
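To spell out each inequality in the chain: $(f(c))^2\lt f(c)$ because $0\lt f(c)\lt1$; $f(c)\lt f(b)$ by the choice of $c\in(0,b)$; and $f(b)\lt{f(b)\over b}$ because $f(b)\gt0$ and $0\lt b\lt1$.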

*Replacing $f$ with $g(x)=-f(-x)$, which satisfies $g(0)=0$ and $g'(x)=(g(x))^2$, shows we can assume $b$ is positive. Replacing $f$ with $g(x)=f(x+a)$, where $a$ is the largest zero of $f$ in $[0,b)$ (it exists by continuity), shows we can assume $f$ is nonzero on $(0,b)$, at which point we can also assume $b\lt1$ by shrinking $b$ if necessary. The hypothesis $f'(x)=(f(x))^2$ now shows $f$ is increasing on $(0,b)$, hence strictly positive (by MVT, if you like), at which point the Intermediate Value Theorem allows us to assume $f(b)\lt1$.

Solution 2:

For a simple ad hoc proof see Barry Cipra's answer.

A more theoretical proof of the stated claim goes as follows: The differential equation $$y'=y^2\qquad\bigl(=:F(x,y)\bigr)\tag{1}$$ satisfies the assumptions of the local existence and uniqueness theorem for ODEs at all points $(x,y)\in{\mathbb R}^2$. By standard theorems about "maximal solutions" it follows that the solution of any IVP $\bigl((1),\ (x_0,y_0)\bigr)$ is globally uniquely determined, and that its graph eventually leaves any given bounded set $B\subset{\mathbb R}^2$. In the case at hand the given initial point is $(0,0)$, and we can guess the global solution $$f(x)=0\qquad(-\infty<x<\infty)\ ,$$ which is therefore the only solution of the original problem.
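For concreteness, here is the separation-of-variables computation behind this uniqueness argument (a sketch, valid on any interval where a solution is nonzero): dividing $(1)$ by $y^2$ and integrating gives

$${y'\over y^2}=1\quad\Longrightarrow\quad -{1\over y}=x-C\quad\Longrightarrow\quad y(x)={1\over C-x}\ ,$$

and such a solution is never zero (indeed it blows up at $x=C$). So a solution that vanishes anywhere, such as ours at $(0,0)$, must be $\equiv0$ by uniqueness.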

Solution 3:

This is an improvement over one of the approaches given by user Christian Blatter.


Let us assume that there is a point $a\neq 0$ such that $f(a) \neq 0$. Let's first handle the case $f(a)>0$. Note that $f'(x)=[f(x)]^{2}\geq 0$, so $f$ is non-decreasing, and since $f(0)=0$ it follows that $a>0$. Moreover, if $x\geq a$ then $f(x)\geq f(a)>0$. Consider the function $$g(x) = x+\frac{1}{f(x)}\ ,$$ which is well defined on $[a, \infty)$, with $$g'(x) =1-\frac{f'(x)}{[f(x)]^{2}}=0\ ,$$ and hence $g$ is constant on $[a, \infty)$. But this is a contradiction, as $g(x)>x$ for all $x\geq a$.
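To spell out the contradiction: $g$ is constant on $[a,\infty)$ with value $c:=g(a)=a+1/f(a)>a$, so evaluating at $x=c$ gives $$c=g(c)=c+\frac{1}{f(c)}>c\ ,$$ which is absurd.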

If $f(a)<0$, then since $f$ is non-decreasing we must have $a<0$, and we can apply the same argument as in the last paragraph on the interval $(-\infty, a]$ to get a contradiction.
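Concretely: for $x\leq a$ we have $f(x)\leq f(a)<0$, so $g(x)=x+1/f(x)$ is well defined on $(-\infty,a]$ and again $g'\equiv 0$, so $g$ is constant with value $c:=a+1/f(a)<a$; evaluating at $x=c$ gives $$c=g(c)=c+\frac{1}{f(c)}<c\ ,$$ again a contradiction.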

It thus follows that $f(x) =0$ for all $x$.