Prove that the functions $C\exp(x)$ are the only ones for which $f(x) = f'(x)$
Solution 1:
Of course $C e^x$ has the same property for any $C$ (including $C = 0$). But these are the only ones.
Proposition: Let $f : \mathbb{R} \to \mathbb{R}$ be a differentiable function such that $f(0) = 1$ and $f'(x) = f(x)$. Then it must be the case that $f = e^x$.
Proof. Let $g(x) = f(x) e^{-x}$. Then
$$g'(x) = -f(x) e^{-x} + f'(x) e^{-x} = (f'(x) - f(x)) e^{-x} = 0$$
by assumption, so $g$ is constant. But $g(0) = f(0) = 1$, so $g(x) = 1$ identically, that is, $f(x) = e^x$. (Dropping the normalization, the same argument gives $g(x) = f(0)$ identically, hence $f(x) = f(0)e^x$ for any differentiable $f$ with $f' = f$.)
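As a quick sanity check outside the proof, the computation of $g'$ can be reproduced symbolically. A minimal sketch using sympy (the hypothesis $f' = f$ is imposed by substitution after differentiating):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')

# g(x) = f(x) * exp(-x); differentiate, then impose the hypothesis f'(x) = f(x)
g = f(x) * sp.exp(-x)
g_prime = sp.diff(g, x).subs(sp.Derivative(f(x), x), f(x))

print(sp.simplify(g_prime))  # prints 0, so g is constant
```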
N.B. It is also true that $e^{x+c}$ has the same property for any $c$. Thus, by the uniqueness just proved, there exists a function $g$ such that $e^{x+c} = g(c) e^x$ (and, by the symmetry of $x+c$, also $e^{x+c} = e^c g(x)$). Setting $x = 0$ in the first equation, we conclude that $g(c) = e^c$, hence $e^{x+c} = e^x e^c$.
This observation generalizes to any differential equation with translation symmetry. Apply it to the differential equation $f''(x) + f(x) = 0$ and you get the angle addition formulas for sine and cosine.
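To spell out the claim for $f''(x) + f(x) = 0$: as a function of $x$, $\sin(x+c)$ solves the equation with initial data $\sin c$ and $\cos c$, and the solution space is spanned by $\sin x$ and $\cos x$, so uniqueness forces $\sin(x+c) = \sin c \cos x + \cos c \sin x$. A minimal sympy check of the resulting identity:

```python
import sympy as sp

x, c = sp.symbols('x c')

# Angle addition formula predicted by the translation-symmetry argument
identity = sp.sin(x + c) - (sp.sin(c)*sp.cos(x) + sp.cos(c)*sp.sin(x))
print(sp.simplify(sp.expand_trig(identity)))  # prints 0
```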
Solution 2:
Let $f(x)$ be a differentiable function such that $f'(x)=f(x)$. This implies that the $k$-th derivative, $f^{(k)}(x)$, is also equal to $f(x)$ for every $k\geq 1$. In particular, $f(x)$ is $C^\infty$ and we can write a Taylor expansion for $f$ centered at $x=0$:
$$T_f(x) = \sum_{k=0}^\infty c_k x^k.$$
Notice that the fact that $f(x)=f^{(k)}(x)$ for all $k\geq 0$ implies that the Taylor series $T_f(x_0)$ converges to $f(x_0)$ for every $x_0\in \mathbb{R}$ (more on this later), so we may write $f(x)=T_f(x)$. Since $f'(x) = \sum_{k=0}^\infty (k+1)c_{k+1}x^k = f(x)$, we conclude that $c_{k+1} = c_k/(k+1)$. Moreover, $c_0 = f(0)$, and therefore $c_k = f(0)/k!$ for all $k\geq 0$. Hence:
$$f(x) = f(0) \sum_{k=0}^\infty \frac{x^k}{k!} = f(0) e^x,$$
as desired.
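As a numerical illustration (a sketch using only Python's standard library, with the arbitrary choice $f(0) = 2.5$), the partial sums of the series converge rapidly to $f(0)e^x$:

```python
import math

def partial_sum(x, n, f0=2.5):
    # f(0) * sum_{k=0}^{n} x^k / k!  -- the truncated series; f0 = 2.5 is arbitrary
    return f0 * sum(x**k / math.factorial(k) for k in range(n + 1))

x = 1.7
for n in (5, 10, 20):
    print(n, partial_sum(x, n))
print(2.5 * math.exp(x))  # the claimed limit f(0) * e^x
```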
Addendum: About the convergence of the Taylor series. Let us use Taylor's remainder theorem to show that the Taylor series for $f(x)$ centered at $x=0$, denoted by $T_f(x)$, converges to $f(x)$ for all $x\in\mathbb{R}$. Let $T_{f,n}(x)$ be the $n$th Taylor polynomial for $f(x)$, also centered at $x=0$. By Taylor's theorem, we know that $$|R_n(x_0)|\leq |f^{(n+1)}(\xi)|\frac{ |x_0 - 0|^{n+1}}{(n+1)!},$$ where $R_n(x_0)=f(x_0) - T_{f,n}(x_0)$ and $\xi$ is a number between $0$ and $x_0$. Let $M=M(x_0)$ be the maximum value of $|f(x)|$ on the interval $I=[-|x_0|,|x_0|]$, which exists because $f$ is differentiable (therefore, continuous) on $I$. Since $f(x)=f^{(n+1)}(x)$ for all $n\geq 0$, we have: $$|R_n(x_0)|\leq |f^{(n+1)}(\xi)|\frac{ |x_0|^{n+1}}{(n+1)!}= |f(\xi)|\frac{ |x_0|^{n+1}}{(n+1)!}\leq M \frac{|x_0|^{n+1}}{(n+1)!} \longrightarrow 0 \ \text{ as } \ n\to \infty.$$ The limit is $0$ because $M$ is a constant (once $x_0$ is fixed) and $A^n/n! \to 0$ for all $A\geq 0$. Therefore, $T_{f,n}(x_0) \to f(x_0)$ as $n\to \infty$ and, by definition, this means that $T_f(x_0)$ converges to $f(x_0)$.
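Numerically, the bound $M|x_0|^{n+1}/(n+1)!$ collapses quickly; a small sketch with the illustrative values $x_0 = 3$ and $M = e^3$ (the correct $M(x_0)$ when $f = e^x$):

```python
import math

x0 = 3.0
M = math.exp(x0)  # max of |e^x| on [-3, 3]; illustrative choice of M(x0)

for n in (5, 10, 20, 40):
    bound = M * x0**(n + 1) / math.factorial(n + 1)
    print(n, bound)  # tends to 0, since A^n/n! -> 0 for any fixed A
```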
Solution 3:
Yet another way: By the chain rule, ${\displaystyle {d \over dx} \ln|f(x)| = {f'(x) \over f(x)} = 1}$. Integrating, you get $\ln |f(x)| = x + C$. Exponentiating both sides, you obtain $|f(x)| = e^{x + C} = C'e^x$, where $C' = e^C > 0$. As a result, $f(x) = C''e^x$, where $C'' = \pm C'$ is a nonzero constant.
If you are worried about $f(x)$ being zero: the above shows $f(x)$ is of the form $C''e^x$ on any interval on which $f(x)$ is nonzero. Since $f(x)$ is continuous, this implies $f(x)$ is always of that form unless $f(x)$ is identically zero. Indeed, on a maximal interval where $f$ is nonzero we have $f(x) = C''e^x$, which tends to a nonzero limit at any finite endpoint; by continuity $f$ would remain nonzero slightly beyond that endpoint, contradicting maximality, so the interval must be all of $\mathbb{R}$. (And if $f(x)$ is identically zero, we can just take $C'' = 0$ anyhow.)
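For comparison, a computer algebra system reaches the same one-parameter family directly; a minimal sympy sketch:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')

# Solve f'(x) = f(x); sympy returns the family C1*exp(x)
print(sp.dsolve(sp.Eq(f(x).diff(x), f(x)), f(x)))  # Eq(f(x), C1*exp(x))
```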
Solution 4:
Hint $\rm\displaystyle\:\ \begin{align} f{\:'}\!\! &=\ \rm a\ f \\ \rm \:\ g'\!\! &=\ \rm a\ g \end{align} \iff \dfrac{f{\:'}}f = \dfrac{g'}g \iff \bigg(\!\!\dfrac{f}g\bigg)' =\ 0\ \iff W(f,g) = 0\:,\ \ W = $ Wronskian
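A quick symbolic check of this chain of equivalences, as a sympy sketch (the hypotheses $\rm f' = a\,f$ and $\rm g' = a\,g$ are imposed by substitution after differentiating):

```python
import sympy as sp

x, a = sp.symbols('x a')
f, g = sp.Function('f'), sp.Function('g')

# Impose f' = a*f and g' = a*g after differentiating
hyp = {sp.Derivative(f(x), x): a*f(x), sp.Derivative(g(x), x): a*g(x)}

quotient_prime = sp.diff(f(x)/g(x), x).subs(hyp)
W = (f(x)*sp.diff(g(x), x) - g(x)*sp.diff(f(x), x)).subs(hyp)  # Wronskian W(f, g)

print(sp.simplify(quotient_prime))  # 0
print(sp.simplify(W))               # 0
```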
This hint is a very special case of the uniqueness theorem for linear differential equations, especially of how the Wronskian serves to measure linear independence of solutions. See here for a proof of the less trivial second-order case (which generalizes to $n$th order). See also the classical result below on Wronskians and linear dependence, from one of my sci.math posts of May 12, 2003.
Theorem $\ \ $ Suppose $\rm\:f_1,\ldots,f_n\:$ are $\rm\:n-1\:$ times differentiable on interval $\rm\:I\subset \mathbb R\:$ and suppose they have Wronskian $\rm\: W(f_1,\ldots,f_n)\:$ vanishing at all points in $\rm\:I\:.\:$ Then $\rm\:f_1,\ldots,f_n\:$ are linearly dependent on some subinterval of $\rm\:I\:.$
Proof $\ $ We employ the following easily proved Wronskian identity:
$\rm\qquad\ W(g\ f_1,\ldots,\:g\ f_n)\ =\ g^n\ W(f_1,\ldots,f_n)\:.\ $ This immediately implies
$\rm\qquad\quad\ \ \ W(f_1,\ldots,\: f_n)\ =\ f_1^{\:n}\ W((f_2/f_1)',\ldots,\:(f_n/f_1)'\:)\quad $ if $\rm\:\ f_1 \ne 0 $
Proceed by induction on $\rm\:n\:.\:$ The Theorem is clearly true if $\rm\:n = 1\:.\:$ Suppose that $\rm\: n > 1\:$ and $\rm\:W(f_1,\ldots,f_n) = 0\:$ for all $\rm\:x\in I.\:$ If $\rm\:f_1 = 0\:$ throughout $\rm\:I\:$ then $\rm\: f_1,\ldots,f_n\:$ are dependent on $\rm\:I.\:$ Else $\rm\:f_1\:$ is nonzero at some point of $\rm\:I\:$ so also throughout some subinterval $\rm\:J \subset I\:,\:$ since $\rm\:f_1\:$ is continuous (being differentiable by hypothesis). By above $\rm\:W((f_2/f_1)',\ldots,(f_n/f_1)'\:)\: =\: 0\:$ throughout $\rm\:J,\:$ so by induction there exists a subinterval $\rm\:K \subset J\:$ where the arguments of the Wronskian are linearly dependent, i.e.
on $\rm\ K:\quad\ \ \ c_2\ (f_2/f_1)' +\:\cdots\:+ c_n\ (f_n/f_1)'\: =\ 0,\ \ $ all $\rm\:c_i'\:=\ 0\:,\ $ some $\rm\:c_j\ne 0 $
$\rm\qquad\qquad\: \Rightarrow\:\ \ ((c_2\ f_2 +\:\cdots\: + c_n\ f_n)/f_1)'\: =\ 0\ \ $ via $({\phantom m})'\:$ linear
$\rm\qquad\qquad\: \Rightarrow\quad\ \ c_2\ f_2 +\:\cdots\: + c_n\ f_n\ =\ c_1 f_1\ \ $ for some $\rm\:c_1,\ c_1'\: =\: 0 $
Therefore $\rm\ f_1,\ldots,f_n\:$ are linearly dependent on $\rm\:K \subset I\:.\qquad$ QED
This theorem has as immediate corollaries the well-known results that the vanishing of the Wronskian on an interval $\rm\: I\:$ is a necessary and sufficient condition for linear dependence of
$\rm\quad (1)\ $ functions analytic on $\rm\: I\:$
$\rm\quad (2)\ $ functions satisfying a monic homogeneous linear differential equation whose coefficients are continuous throughout $\rm\: I\:.\:$
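As a concrete instance of these corollaries, sympy's built-in wronskian separates a linearly dependent pair of solutions from an independent one (both pairs satisfy monic homogeneous linear equations with continuous coefficients, e.g. $\rm\:y'' - 3y' + 2y = 0\:$ for the second pair):

```python
import sympy as sp
from sympy import wronskian

x = sp.symbols('x')

# Dependent pair: the Wronskian vanishes identically
print(sp.simplify(wronskian([sp.exp(x), 3*sp.exp(x)], x)))   # 0

# Independent pair: the Wronskian never vanishes
print(sp.simplify(wronskian([sp.exp(x), sp.exp(2*x)], x)))   # exp(3*x)
```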
Solution 5:
Here is a different take on the question. There is a whole spectrum of discrete "calculi" which converge to the continuous case, each of which has its own special "$e$".
Pick some $t>0$. Consider the equation $$f(x)=\frac{f(x+t)-f(x)}{t}.$$ It is not hard to show by induction that there is a function $C_t:[0,1)\to \mathbb{R}$ such that $$f(x)=C_t\left(\left\{\frac{x}{t}\right\}\right)(1+t)^{\lfloor\frac{x}{t}\rfloor},$$ where $\{\cdot\}$ and $\lfloor\cdot\rfloor$ denote fractional and integer part, respectively. If I take Qiaochu's answer for comparison, then $C_t$ plays the role of the constant $C$ and $(1+t)^{\lfloor\frac{x}{t}\rfloor}$ plays the role of $e^x$. Therefore, for such a discrete calculus, the right value of "$e$" is $(1+t)^{1/t}$. Now it is clear that as $t\to 0$ the equation becomes $f(x)=f'(x)$, and $(1+t)^{1/t}\to e$.
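A numerical sketch of the limit (with the arbitrary initial slice $C_t \equiv 1$): the discrete solution $(1+t)^{\lfloor x/t\rfloor}$ approaches $e^x$, and the discrete "$e$" approaches $e$, as $t \to 0$:

```python
import math

x = 2.0
for t in (0.1, 0.01, 0.001):
    discrete = (1 + t) ** math.floor(x / t)  # discrete solution with C_t = 1
    base = (1 + t) ** (1 / t)                # the "e" of this discrete calculus
    print(t, discrete, base)

print(math.exp(x), math.e)  # the continuous limits
```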