Why is the formal solution to a linear differential equation of exponential form?
Solution 1:
Revised upon reading OP's comments. Your specific questions:

1. There is no other function like that. Only the exponential (up to a constant multiple) satisfies $f'=f$ non-trivially (of course $f=0$ also works).
2. You are confusing exponents with repeated-differentiation notation. If $f=e^{zx}$, where $z$ is a constant, then $f'=ze^{zx}$ and $f''=z(ze^{zx})=z^2e^{zx}$: each differentiation brings down an additional factor of $z$ from the exponent and multiplies it onto the existing function. That is why you get that.
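To see the "each differentiation brings down a $z$" property numerically, here is a small Python sketch (the values of $z$ and $x_0$ are my own arbitrary choices) comparing finite-difference derivatives of $e^{zx}$ against $z\,e^{zx}$ and $z^2 e^{zx}$:

```python
import math

def d1(f, x, h=1e-6):
    """Central-difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

def d2(f, x, h=1e-4):
    """Central-difference approximation to f''(x)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / (h * h)

z = 3.0
f = lambda x: math.exp(z * x)
x0 = 0.7

# Each differentiation brings down one more factor of z:
print(abs(d1(f, x0) - z * f(x0)) < 1e-4)      # f'  = z   * e^{zx} -> True
print(abs(d2(f, x0) - z * z * f(x0)) < 1e-3)  # f'' = z^2 * e^{zx} -> True
```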
After reading your comments, I think I have a better handle on where your confusion lies.
First, note that the exponential is a general solution only for linear differential equations with constant coefficients. If you have variable coefficients, then all bets are off, as you can get Bessel functions and other non-exponential solutions.
With that said, we are focusing on cases like the following: $af''+bf'+cf=0$
Now, what is this equation telling us? It is saying that the left-hand side (LHS) must be zero, so the weighted sum of the function and its first two derivatives must equal 0 everywhere $f$ is defined. Therefore, the derivatives of $f$ must be expressible as linear combinations of each other, so that the terms can cancel for all values of $x$.
You can see that by using the above equation to solve for $f$ or one of its two derivatives in terms of the other two:
$f=-\frac{1}{c}(af''+bf')$
$f'=-\frac{1}{b}(af''+cf)$ and
$f''=-\frac{1}{a}(bf'+cf)$
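As a concrete sanity check, here is a short Python sketch (the coefficients are my own example, chosen so that the characteristic roots come out integer): if $z$ is a root of $az^2+bz+c=0$, then $f=e^{zx}$ makes $af''+bf'+cf$ vanish for every $x$, since $f'=zf$ and $f''=z^2f$.

```python
import math

a, b, c = 1.0, -5.0, 6.0  # example: z^2 - 5z + 6 = (z - 2)(z - 3)
disc = math.sqrt(b * b - 4 * a * c)
roots = [(-b + disc) / (2 * a), (-b - disc) / (2 * a)]  # 3.0 and 2.0

for z in roots:
    for x in (-1.0, 0.0, 0.5, 2.0):
        fx = math.exp(z * x)
        # with f' = z f and f'' = z^2 f, the residual is (a z^2 + b z + c) f
        residual = a * z * z * fx + b * z * fx + c * fx
        print(abs(residual) < 1e-9 * fx)  # True at every x
```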
As you can see, each derivative is equal to a linear combination of the other two, hence all must be expressible as linear combinations of some set of functions. If that were not the case, then there would be some values of $x$ where the above equation takes a non-zero value. As an analogy, think of the following very simple equation:
$x^2 + 3x + af(x) = 0$, where we want to solve for $f$. By simple algebra (one step!) we know $f$ must be expressible in terms of $x^2$ and $x$; otherwise there will be at least one $x$ where the left side is non-zero. For example, if $f(x) = x^3 + x^2 + x$, there is no value of $a$ that will satisfy the equation for all $x$.
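The one-step algebra can be checked directly (a tiny sketch; the value of $a$ below is arbitrary, any nonzero constant works):

```python
a = 2.0  # arbitrary nonzero constant

def f(x):
    # solving x^2 + 3x + a*f(x) = 0 for f gives a combination of x^2 and x
    return -(x * x + 3 * x) / a

for x in (-4.0, -1.0, 0.0, 2.5):
    print(abs(x * x + 3 * x + a * f(x)) < 1e-12)  # True at every x
```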
Likewise, as we saw above, linear differential equations with constant coefficients will equal 0 for all $x$ only if the function and its derivatives can be expressed as linear combinations of a finite number of functions, which allows us to select coefficient values so that all the derivatives cancel out.
This can be extended to the case where there are "repeated roots" in the characteristic equation. Here, the characteristic equation has $N$ roots counted with multiplicity, but only $N-r$ distinct ones, so plain exponentials give us only $N-r$ functions. Hence, we are missing some functions in our "function space", which is similar to a vector space, except that the unit vectors $\overrightarrow{\mathbf{i}},\overrightarrow{\mathbf{j}},\overrightarrow{\mathbf{k}}$, etc., are replaced by abstract "directions" represented by functions, so that an abstract "vector" in this space is $\overrightarrow{\mathbf{f}}=\langle c_1,c_2,c_3,\dots,c_n\rangle\cdot\langle f_1,f_2,f_3,\dots,f_n\rangle$. Therefore, we have a situation where $r$ of the "vectors" are collinear, just as in matrix algebra, and so the system is underdetermined, just like when you have $N$ equations and $N+M$ unknowns: you need more equations (here, more independent functions) to get a unique solution. That is why we use the functions $x^ne^{kx}$ for $n=1,\dots,M-1$, where $M$ is the multiplicity of a particular root (the case $n=0$ being the plain exponential). These functions have the unique property that their derivatives are expressible in terms of lower values of $n$, and hence remain expressible in this expanded "basis" of functions.
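For instance, with the double root $k=2$ of $z^2-4z+4=(z-2)^2$, the extra solution is $xe^{kx}$. A small sketch (my own example, derivatives computed by hand via the product rule) confirms it satisfies $f''-4f'+4f=0$:

```python
import math

k = 2.0                 # double root of z^2 - 4z + 4 = (z - 2)^2
a, b, c = 1.0, -4.0, 4.0

def f(x):   return x * math.exp(k * x)
def fp(x):  return (1 + k * x) * math.exp(k * x)            # product rule
def fpp(x): return (2 * k + k * k * x) * math.exp(k * x)    # differentiate again

for x in (-1.0, 0.0, 0.3, 1.5):
    residual = a * fpp(x) + b * fp(x) + c * f(x)
    print(abs(residual) < 1e-9)  # True at every x
```

Note that $f'$ and $f''$ stay inside the span of $\{e^{kx},\,xe^{kx}\}$, which is exactly why the cancellation is possible.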
Overall, that is the key property of the exponential in linear differential equations with constant coefficients: derivatives of the exponential, or of $x^n$ times the exponential, produce functions that do not stray outside the set of functions forming the basis for the abstract space of solutions. Other functions will not do this, which means that their higher derivatives produce functions that are not present in the proposed solution, and which therefore cannot be cancelled for all $x$. As an example, take $f(x) = \sum_n c_n\frac{1}{x^n}$ (a finite sum). No matter what you choose for the $c_n$, the derivatives will produce powers that are not in $f$, and hence there is no set of constant coefficients that will allow the derivatives to cancel out to zero for all $x$.
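The failure mode can be seen concretely: differentiating $x^{-n}$ yields $-n\,x^{-(n+1)}$, a new power outside whatever finite set you started with. A quick finite-difference check (the evaluation point is arbitrary):

```python
def num_deriv(f, x, h=1e-6):
    """Central-difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# d/dx x**(-n) = -n * x**(-(n+1)): the exponent keeps growing, so each
# differentiation escapes the original finite set of powers.
x0 = 1.7
for n in (1, 2, 3):
    f = lambda x, n=n: x ** (-n)
    approx = num_deriv(f, x0)
    exact = -n * x0 ** (-(n + 1))
    print(abs(approx - exact) < 1e-6)  # True for every n
```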
Hope that was a bit clearer than before.