Prove that $e^x, e^{2x}, \dots, e^{nx}$ are linearly independent in the vector space of functions $\mathbb{R} \to \mathbb{R}$

Prove that $e^x, e^{2x}, \dots, e^{nx}$ are linearly independent in the vector space of functions $\mathbb{R} \to \mathbb{R}$.

Isn't it sufficient to say that $e^y \in \mathbb{R}^+$ for every $y \in \mathbb{R}$?
Therefore there are no $\gamma_1, \dots, \gamma_n$, not all zero, such that $\gamma_1 e^x + \gamma_2 e^{2x} + \dots + \gamma_n e^{nx} = 0$.
Therefore they are not linearly dependent.

I've seen a proof that goes as follows:
take the first $(n-1)$ derivatives of the equation; this gives $n$ equations in the $n$ unknowns $\gamma_1, \dots, \gamma_n$. Arranging them in a matrix (which turns out to be a Vandermonde matrix) and computing the determinant, which is $\ne 0$, shows that only the trivial solution exists, and hence there is no linear dependence.
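
For what it's worth, that determinant claim is easy to check symbolically; here is a quick SymPy sketch (my own illustration, with a small $n$ and the derivative system evaluated at $x = 0$):

```python
import sympy as sp

n = 5  # small sample size; the pattern is the same for any n
x = sp.symbols('x')
gammas = sp.symbols(f'gamma1:{n + 1}')

# The k-th derivative of sum_j gamma_j e^{jx}, evaluated at x = 0,
# yields the equation sum_j gamma_j * j**k = 0 for k = 0, ..., n-1.
f = sum(g * sp.exp(j * x) for j, g in enumerate(gammas, start=1))
M = sp.Matrix(n, n, lambda k, j: sp.diff(f, x, k).subs(x, 0).coeff(gammas[j]))

print(M)        # a Vandermonde matrix with nodes 1, ..., n
print(M.det())  # nonzero (288 for n = 5), so only the trivial solution remains
```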

Is all that necessary?


Solution 1:

Positivity of the exponential is not enough, as pointed out in the comments and in anorton's answer.

Start from the equation $$\forall x\in\mathbb R, \quad \sum_{j=1}^n\gamma_je^{jx}=0.$$ Multiply this equation by $e^{-nx}$. We get for any $x$, $$\gamma_n+\sum_{j=1}^{n-1}\gamma_je^{(j-n)x}=0.$$ Now, letting $x\to+\infty$, we obtain $\gamma_n=0$. We can either repeat the procedure or write it properly by induction.
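
For a concrete sanity check, here is a small SymPy sketch of the limit step (an illustration with $n = 4$, not part of the proof itself):

```python
import sympy as sp

x = sp.symbols('x', real=True)
n = 4  # small example; the argument is the same for any n
gammas = sp.symbols(f'gamma1:{n + 1}', real=True)

f = sum(g * sp.exp(j * x) for j, g in enumerate(gammas, start=1))

# Multiplying by e^{-nx} turns each term with j < n into gamma_j * e^{(j-n)x},
# which tends to 0 as x -> +oo, so the limit isolates gamma_n.
damped = sp.powsimp(sp.expand(f * sp.exp(-n * x)))
print(sp.limit(damped, x, sp.oo))  # prints gamma4
```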

Solution 2:

To show that the functions $e^{\alpha_i x}$ are linearly independent over $\Bbb R$ for distinct but otherwise arbitrary real $\alpha_i$, the arguments presented by Davide Giraudo and julien in their answers seem to be the way to go. So, +1 for each of their answers and lauds for their fine work!

However, if one is primarily interested in showing that the functions $e^x$, $e^{2x}$, $\dots$, $e^{nx}$ are linearly independent, that is, the functions $e^{mx}$ for positive integral $m$, as indicated in the title and body of the question, then the following little trick may be of interest: suppose there existed $\beta_i \in \Bbb R$, not all zero, with

$\sum_1^n \beta_i e^{ix} = 0; \tag{1}$

then, since $e^{ix} = (e^x)^i$, (1) becomes

$\sum_1^n \beta_i (e^x)^i = 0, \tag{2}$

that is, $e^x$ must be a (real) zero of the polynomial equation

$p(y) = \sum_1^n \beta_i y^i = 0, \tag{3}$

which implies that $e^x$ can take on at most $n$ values, which must be among the real zeroes of $p(y)$. But $e^x$ takes infinitely many values as $x$ ranges over $\Bbb R$, so that just won't work, will it?
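
If one wants to see the trick in action, here is a minimal SymPy sketch with some hypothetical nonzero coefficients $\beta_i$, chosen purely for illustration:

```python
import sympy as sp

y = sp.symbols('y', real=True)
betas = [2, -5, 1]  # hypothetical nonzero beta_1, beta_2, beta_3 (illustration only)

# If sum_i beta_i e^{ix} vanished identically, then e^x would have to be a
# root of the fixed polynomial p(y) = sum_i beta_i y^i for every x.
p = sum(b * y**i for i, b in enumerate(betas, start=1))

print(sp.solve(p, y))  # finitely many candidate values for e^x
# ...but e^x sweeps out all of (0, oo), so no such betas can exist.
```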

Note Added in Edit, Friday 10 January 2014 11:43 AM PST: In the light of the warm reception, in terms of upvotes, this answer has received and also in the light of Christoph Pegel's comment, I would like to point out that this approach goes a long way toward completely algebraicizing this problem, at least for functions of the form $e^{ix}$ and related. As indicated by Christoph Pegel, any function satisfying $f(mx) = (f(x))^m$ for positive integral $m$ will be susceptible to this argument, viz. we would have to have

$\sum_1^n \beta_i (f(x))^i = 0 \tag{4}$

if the $f(mx)$, $1 \le m \le n$, were linearly dependent. Note that the fact that the polynomials in (3) and (4) have at most finitely many zeroes is a purely algebraic result, depending only on the Euclidean division formula $p(x) = (x - \lambda)q(x)$ for $\lambda$ a root of $p(x)$; in fact, (4) shows the $(f(x))^i$ are linearly independent if $f(x)$ takes on infinitely many values. The results extend to other $f$, e.g. $f(x) = a^x$ for nonzero real $a$, and apparently even to base fields other than $\Bbb R$. A pretty algebraic situation, really, for the $e^{mx}$ and for any other $f$ such that $f(mx) = (f(x))^m$. End of Note.

Hope this helps! Cheers,

and as always,

Fiat Lux!!!