Showing $f'(x) = f(x)$ implies an exponential function

Solution 1:

Let $$G(x)=\frac{f(x)}{e^x}.$$ Differentiate, and use $f'(x)=f(x)$ to conclude $G'(x)$ is identically $0$.

Then either quote the theorem that a function whose derivative is identically $0$ is constant, or prove that theorem yourself using the Mean Value Theorem.

Note that more or less the same proof shows that if $f'(x)=kf(x)$, where $k$ is a constant, then $f(x)=Ce^{kx}$ for some constant $C$.
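The generalization $f'(x)=kf(x)\implies f(x)=Ce^{kx}$ can be sanity-checked numerically (an illustration, not part of the proof). The sketch below, in Python with only the standard library, advances $f'=kf$ with a classical Runge–Kutta step and compares the result against $Ce^{kx}$; the helper name `rk4_solve` is my own choice, not from the answer.

```python
import math

def rk4_solve(k, f0, x_end, n_steps):
    """Numerically solve f'(x) = k*f(x), f(0) = f0, with classical RK4."""
    h = x_end / n_steps
    f = f0
    for _ in range(n_steps):
        k1 = k * f
        k2 = k * (f + h * k1 / 2)
        k3 = k * (f + h * k2 / 2)
        k4 = k * (f + h * k3)
        f += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
    return f

# f'(x) = -0.5 * f(x) with f(0) = 2 should give f(x) = 2 * exp(-0.5 * x).
numeric = rk4_solve(-0.5, 2.0, 1.0, 1000)
exact = 2.0 * math.exp(-0.5)
print(abs(numeric - exact))  # tiny
```

For $k=-0.5$ and $f(0)=2$ the computed value at $x=1$ agrees with $2e^{-0.5}$ to many digits.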

Solution 2:

Let $g(x) = f(x) \cdot\exp(-x)$. Then for each $x \in \mathbb R$: \begin{align*} g'(x) &= f'(x)\exp(-x) - f(x)\exp(-x)\\ &= f(x)\exp(-x) - f(x)\exp(-x)\\ &= 0. \end{align*} So $g$ is constant, and since $g(x)=g(0)=f(0)$, it follows that $f(x)=f(0)\,e^x$.

Solution 3:

Really, integration is antidifferentiation, which means that if you know how to do derivatives, you know how to do integrals. So I'm not exactly sure what you mean by not using integrals. For example, here is an easy Calc 1 proof:

$\begin{align*} f'(x) &= f(x) \\ \frac{f'(x)}{f(x)} &= 1 \\ \left(\ln (f(x))\right)' &= 1 \\ \ln (f(x)) &= x + C \\ f(x) &= e^{x + C} = a e^x \end{align*}$ (This step assumes $f(x) > 0$ so that $\ln(f(x))$ is defined; for $f$ taking negative values one can use $\ln|f(x)|$ instead.)

So, did I use integration? Or did I just use the knowledge that the only functions whose derivative is $1$ are of the form $x + C$ for a constant $C$? All you need to know for that is $\frac{d}{dx} x = 1$ and the following theorem:

If $F'(x) = G'(x)$ for all $x$ in $(a, b)$, then there is a constant $C$ such that $F(x) = G(x) + C$ for all $x$ in $(a, b)$.

This is given in the section on the Mean Value Theorem for Derivatives in Varberg, Purcell, and Rigdon, before antiderivatives or integrals are introduced. And the proof uses the MVT and no theory of integration.
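The computation in this solution can be checked numerically (illustrative only; the constant $a=2.5$ and the names below are arbitrary choices of mine): for $f(x)=ae^x$, the quantity $\ln(f(x))-x$ is the constant $\ln a$, and a central finite difference confirms $(\ln f)' = 1$.

```python
import math

a = 2.5  # an arbitrary positive constant, chosen for illustration
f = lambda x: a * math.exp(x)

# (ln f(x))' = f'(x)/f(x) = 1, so ln f(x) - x should equal the constant ln a.
for x in [-3.0, -1.0, 0.0, 0.7, 2.0]:
    assert abs(math.log(f(x)) - x - math.log(a)) < 1e-12

# Central finite difference of ln f at x = 1: should be very close to 1.
fd = (math.log(f(1.0 + 1e-6)) - math.log(f(1.0 - 1e-6))) / 2e-6
print(fd)  # close to 1
```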

Solution 4:

Just for fun (using the "Taylor approach"):

Suppose $f_1$ and $f_2$ are two functions both of which satisfy the conditions $$\tag{1} f'(x)=f(x),\quad \text{for all }\ x\in\Bbb R $$ and $$\tag{2} f(0)=1. $$ Let $F=f_1-f_2$. One easily verifies that $F$ satisfies $(1)$ and $F(0)=0$. One also easily verifies, by induction, that $F^{(n)}(x)=F(x)$ for all positive integers $n$ and all $x\in\Bbb R$.

Now, fix $x\in\Bbb R$. Let $I_x$ be the closed interval with endpoints $0$ and $x$. By Taylor's Theorem we have for $n$ a fixed positive integer: $$ F(x)=F(0)+{F'(0)\over 1!}x +\cdots+{F^{(n-1)}(0)\over (n-1)!}x^{n-1} +{F^{(n )}(c_n)\over n !}x^{n } $$ for some $c_n$ between $0$ and $x$.

But $F^{(k)}(0)=F(0)=0$ for each $1\le k\le n-1$; so, $$\tag{3} F(x) = {F^{(n )}(c_n)\over n !}x^{n }={F(c_n)\over n!}x^n. $$ Since $F$ is continuous on the compact interval $I_x$, it is bounded there, say $|F(t)|\le K$ for all $t\in I_x$; it then follows from $(3)$ that $$\tag{4} |F(x) |\le {K\over n !}|x|^{n }, $$ where the constant $K$ does not depend on $n$.

As $n$ was arbitrary, and $|x|^n/n!\to 0$ as $n\to\infty$ for fixed $x$, it follows from $(4)$ that $F(x)=0$.

As $x$ was arbitrary, it follows that $f_1=f_2$.

From this it follows that the unique function satisfying $(1)$ and $(2)$ is $f(x)=e^x$; whence the unique function satisfying $f'=f$ and $f(0)=a$ is $f(x)=ae^x$.
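The step from $(4)$ to $F(x)=0$ rests on the fact that $|x|^n/n!\to 0$ for every fixed $x$, even when $|x|>1$. A quick numerical illustration (the variable names are mine, not the answer's):

```python
import math

# The bound |F(x)| <= K * |x|**n / n! forces F(x) = 0, because for any
# fixed x the factor |x|**n / n! tends to 0 as n grows: the factorial in
# the denominator eventually outruns the geometric growth of |x|**n.
x = 5.0
terms = [x**n / math.factorial(n) for n in range(1, 61)]
print(terms[0], terms[29], terms[59])  # 5.0, then rapidly shrinking
```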