$f'(x)=f(x)$ and $f(0)=0$ implies that $f(x)=0$ formal proof

How can I prove that if a function satisfies $f'(x)=f(x)$ and $f(0)=0$, then $f(x)=0$ for every $x$? I have an idea, but it's too long; I want to know if there is a simpler way to do it. Thanks! Obviously in a formal way.


An implicit assumption is that the function is defined on some open interval containing $0$. Set

$$g(x)=e^{-x}f(x)$$

and compute the derivative:

$$g'(x)=-e^{-x}f(x)+e^{-x}f'(x)=-e^{-x}f(x)+e^{-x}f(x)=0$$

so the function $g$ is constant on the interval where it's defined. Since $g(0)=e^{-0}f(0)=0$ you can conclude that $g(x)=0$ for all $x$ and therefore also $f(x)=0$ for all $x$.
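As a sanity check, this computation can be verified symbolically; here is a minimal sketch using sympy (assumed available), imposing the hypothesis $f'=f$ by substitution:

```python
from sympy import Function, Derivative, exp, symbols, simplify

x = symbols('x')
f = Function('f')

g = exp(-x) * f(x)
g_prime = g.diff(x)  # -e^{-x} f(x) + e^{-x} f'(x)

# Impose the hypothesis f'(x) = f(x) by substituting for the derivative.
g_prime_under_hypothesis = g_prime.subs(Derivative(f(x), x), f(x))

print(simplify(g_prime_under_hypothesis))  # 0
```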


Without the initial assumption, you can get different functions with that property: define, for instance, $$f(x)=\begin{cases} 0 & \text{if $-1<x<1$}\\ e^x & \text{if $2<x<3$} \end{cases}$$ Then $f$ satisfies the requirements, but it is not identically zero.


We can solve the differential equation $f'(x)-f(x)=0$. It's a linear differential equation with constant coefficients, and we know how to solve it. Setting $f(x)=e^{\lambda x}$ we obtain $\lambda e^{\lambda x} - e^{\lambda x}=0$, or $\lambda - 1 =0$, which implies $\lambda = 1$. So the general solution is $f(x)=c e^x$. But we have the condition $f(0)=0$, so $c e^0=0 \implies c=0$, and the final solution is $f(x)=0$.
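For reference, the same initial value problem can be handed to a computer algebra system; a hedged sketch using sympy's `dsolve` (assuming sympy is installed):

```python
from sympy import Function, Eq, dsolve, symbols

x = symbols('x')
f = Function('f')

# General solution of f' = f: a one-parameter family c*e^x.
general = dsolve(Eq(f(x).diff(x), f(x)), f(x))
print(general)

# Imposing the initial condition f(0) = 0 forces the constant to vanish.
particular = dsolve(Eq(f(x).diff(x), f(x)), f(x), ics={f(0): 0})
print(particular)  # Eq(f(x), 0)
```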


Below I show how the standard proof (e.g. egreg's answer) is a special case of more general results on uniqueness theorems and Wronskians. Let's rewrite the proof in slightly more general form. Given that $\,f\,$ and $\,g = e^x\,$ are solutions of $\,y' = y\,$ we deduce

$$\begin{eqnarray}\color{#c00}{f'=f}\\ \color{#0a0}{g'=g}\end{eqnarray}\ \ \Rightarrow\ \ \left(\dfrac{f}g\right)' = \dfrac{\color{#c00}{f'}g-f\color{#0a0}{g'}}{g^2} = \dfrac{\color{#c00}fg-f\color{#0a0}g}{g^2} = 0\ \ \Rightarrow\ \dfrac{f}g = c\ \ \rm constant$$

Therefore $\ f = cg = ce^x\ $ so $\ 0 = f(0) = c,\ $ thus $\, f = 0.\, $ This generalizes. The same proof shows that, assuming appropriate differentiability/continuity conditions, if $\,f,g\,$ are solutions of $\, y' = h y\,$ on an interval $\,I\,$ where the Wronskian $\,W(f,g) := f'g - fg' = 0\,$ on $\,I,\,$ then $\,f,g\,$ are linearly dependent on $\,I,\,$ i.e. $\,c_1 f = c_2 g \,$ on $\,I\,$ for some constants $\,c_i.\,$
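The vanishing Wronskian in this special case can be checked symbolically; a small sketch with sympy (assumed available), using the definition $W(f,g) = f'g - fg'$ from above:

```python
from sympy import exp, symbols, simplify

x, c = symbols('x c')

f = c * exp(x)  # a generic solution of y' = y (c an arbitrary constant)
g = exp(x)      # the particular solution e^x

# Wronskian W(f, g) = f'g - fg', as defined in the text.
W = f.diff(x) * g - f * g.diff(x)
print(simplify(W))  # 0
```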

Thus, by a very simple proof, we've deduced a uniqueness theorem for solutions of linear first-order differential equations of this type. These ideas extend to analogous higher-order linear differential equations. See here for a proof of the second-order case (which generalizes to $n$'th order), using variation of parameters, and see here for the discrete analog for difference equations (recurrences). See also the classical result below on Wronskians and linear dependence.

Theorem $\ \ $ Suppose $\rm\:f_1,\ldots,f_n\:$ are $\rm\:n-1\:$ times differentiable on interval $\rm\:I\subset \mathbb R\:$ and suppose they have Wronskian $\rm\: W(f_1,\ldots,f_n)\:$ vanishing at all points in $\rm\:I\:.\:$ Then $\rm\:f_1,\ldots,f_n\:$ are linearly dependent on some subinterval of $\rm\:I\:.$

Proof $\ $ See this answer.


Here is a different approach, more suited for calculus students who do not know how to solve differential equations yet.

It's easy to prove by induction that, since $f'=f$, the function $f$ has derivatives of all orders and $f = f' = f'' = \cdots$ for all $x$; in particular, $f(0) = f'(0) = f''(0) = \cdots = 0$.

Then Taylor's Theorem with remainder justifies expanding $f(x)$ in a series around $0$: on any bounded interval, $f^{(n+1)} = f$ is bounded, so the Lagrange remainder $\frac{f^{(n+1)}(\xi)}{(n+1)!}x^{n+1}$ tends to $0$ as $n \to \infty$, and we get

$$ f(x) = \sum_{n=0}^\infty \frac{x^n}{n!} f^{(n)}(0) = \sum_{n=0}^\infty \frac{x^n}{n!} \cdot 0 = 0. $$
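As an illustration of why the single condition $f(0)=0$ kills every Taylor coefficient at once, one can check with sympy (assumed available) that for the general solution $Ce^x$ of $f'=f$, every derivative at $0$ equals the same constant $C$:

```python
from sympy import exp, symbols

x, C = symbols('x C')
f = C * exp(x)  # general solution of f' = f

# Each derivative f^{(n)}(0) evaluates to the same constant C.
derivs_at_zero = [f.diff(x, n).subs(x, 0) for n in range(6)]
print(derivs_at_zero)  # [C, C, C, C, C, C]
```

So setting $f(0)=0$ forces $C=0$, and every term of the Taylor series vanishes with it.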