You need to check whether the functions are linearly independent, as you said.

One way to go about this, which ties it in with things you likely already know, is to evaluate the equation at several points, as you did for $x=0$.

You get one condition for $x=0$, another for $x=1$, and still another for $x=2$.

Each condition on its own allows more than one solution, but together they have only one common solution, which is what you are after.
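To make that concrete, here is a sketch with the three points $x=0$, $x=1$, $x=2$ (any three distinct points would do just as well): $$\begin{cases}\alpha+\beta+\gamma=0 & (x=0)\\ \alpha e+\beta e^{2}+\gamma e^{3}=0 & (x=1)\\ \alpha e^{2}+\beta e^{4}+\gamma e^{6}=0 & (x=2)\end{cases}$$ The coefficient matrix has determinant $e^{3}(e-1)(e^{2}-1)(e^{2}-e)\neq 0$ (a Vandermonde determinant in $1, e, e^{2}$ after pulling $e$ and $e^{2}$ out of the last two rows), so the only common solution is $\alpha=\beta=\gamma=0$.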


Write $$\alpha e^x + \beta e^{2x} + \gamma e^{3x} = 0.$$ You can cancel the positive factor $e^x$, so $$\alpha + \beta e^{x} + \gamma e^{2x} = 0.$$ Suppose this has a solution with $\alpha$, $\beta$, $\gamma$ not all zero. Then, as you say, $$ \alpha + \beta + \gamma = 0\qquad \qquad (1)$$ because the equation must hold at $x = 0$. But it must also hold at $x = \ln n$, which gives $$ \alpha + \beta n + \gamma n^2= 0\qquad \qquad (2)$$ for every $n > 1$. It should already be clear that this forces all three coefficients to be zero, but to press the point I'll continue. Equation $(1)$ gives $\alpha = -\beta - \gamma$, which we can substitute into $(2)$ to get $$ \beta (n-1) + \gamma (n^2 - 1)= 0,$$ which must hold for all $n > 1$. Now put, say, $n = 2$ and $n = 3$ to get the pair of equations $$ \beta + 3 \gamma = 0, \qquad 2\beta + 8\gamma = 0, $$ whose only solution is $\beta = \gamma = 0$; then $(1)$ gives $\alpha = 0$ as well.

This proves that your functions are linearly independent.
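If you want to double-check the algebra by machine, here is a minimal SymPy sketch (purely optional; variable names are illustrative) that solves $(1)$ together with $(2)$ for $n = 2$ and $n = 3$:

```python
# Minimal SymPy check of equations (1) and (2) above for n = 2 and n = 3
# (an optional verification sketch; variable names are illustrative).
from sympy import symbols, solve

alpha, beta, gamma = symbols('alpha beta gamma')

equations = [
    alpha + beta + gamma,          # (1): x = 0
    alpha + 2*beta + 4*gamma,      # (2) with n = 2, i.e. x = ln 2
    alpha + 3*beta + 9*gamma,      # (2) with n = 3, i.e. x = ln 3
]

print(solve(equations, [alpha, beta, gamma]))
# -> {alpha: 0, beta: 0, gamma: 0}
```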


Hint:

Let $e^x=y$, so that $e^{2x}=y^2$ and $e^{3x}=y^3$; then you have:

$\alpha y +\beta y^2+ \gamma y^3=0$

where the $0$ on the right-hand side is the zero polynomial.

Now: when is a polynomial the zero polynomial?

In general:

The $0$ on the right-hand side is the neutral element for the addition of functions in the vector space, not simply the number $0$; this means it is the function $f(x)=0\quad \forall x \in \mathbb{R}$.
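One way to finish the hint: a nonzero polynomial of degree at most $3$ can have at most $3$ roots, but here $$\alpha y+\beta y^{2}+\gamma y^{3}=0 \quad\text{for every } y=e^{x}\in(0,\infty),$$ i.e. for infinitely many values of $y$, so it must be the zero polynomial, which forces $\alpha=\beta=\gamma=0$.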


You have to prove $$ \forall x\in\mathbb{R}:\alpha e^{x}+\beta e^{2x}+\gamma e^{3x}=0\Leftrightarrow\alpha,\beta,\gamma=0, $$

but I think the quantifier applies only to the part on the left side of the $\Leftrightarrow$, like this: $$ \left(\forall x\in\mathbb{R}:\alpha e^{x}+\beta e^{2x}+\gamma e^{3x}=0\right) \Leftrightarrow\,\alpha,\beta,\gamma=0. $$

So for example $\alpha = -1, \beta = 1, \gamma = 0$ satisfies $\alpha e^{x}+\beta e^{2x}+\gamma e^{3x}=0$ when $x=0$, but it doesn't satisfy the equation for all values of $x$.

If you had to prove $$ \forall x\in\mathbb{R}:\left(\alpha e^{x}+\beta e^{2x}+\gamma e^{3x}=0 \Leftrightarrow\,\alpha,\beta,\gamma=0\right) $$ then you would be in trouble, because that statement is not true; but that's not how we prove independence of the functions, so you don't need to worry about that.


Hint: Use the Wronskian and show that the Wronskian determinant does not vanish.
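For completeness, a sketch of that computation (factoring $e^{x}$, $e^{2x}$, $e^{3x}$ out of the three columns): $$ W(x)=\det\begin{pmatrix} e^{x} & e^{2x} & e^{3x}\\ e^{x} & 2e^{2x} & 3e^{3x}\\ e^{x} & 4e^{2x} & 9e^{3x} \end{pmatrix} = e^{6x}\det\begin{pmatrix} 1 & 1 & 1\\ 1 & 2 & 3\\ 1 & 4 & 9 \end{pmatrix} = 2e^{6x}\neq 0, $$ so the functions are linearly independent on $\mathbb{R}$.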