Solution 1:

Two vectors

An eigenvector $v$ of a transformation $A$ is a nonzero vector whose direction does not change when the transformation is applied to it, i.e., $Av$ is collinear with $v$. The only thing that may change is its length, and the factor of that change is its eigenvalue $\lambda$, so that $Av = \lambda v$.
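For instance (a concrete illustration of mine, not part of the original argument), take

$$A = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}, \qquad v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \qquad v_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}.$$

Then $Av_1 = 2v_1$ and $Av_2 = 3v_2$: each vector keeps its direction and is merely stretched by its eigenvalue, $\lambda_1 = 2$ and $\lambda_2 = 3$.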

So, if

$$Av_1 = \lambda_1 v_1, \quad Av_2 = \lambda_2 v_2, \quad \lambda_1 \ne \lambda_2,$$

it means that $A$ stretches the vectors $v_1$ and $v_2$ by different factors. This would, of course, be impossible if they had the same direction, i.e., if they were collinear, which for two nonzero vectors is the same as being linearly dependent.
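Concretely, in the diagonal example above, every nonzero multiple of $v_1$ is an eigenvector with the same eigenvalue:

$$A(cv_1) = cAv_1 = 2cv_1,$$

so a single line through the origin carries exactly one stretch factor, and two distinct factors force two distinct directions.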

More than two vectors

Let us assume that we have eigenvectors $(v_i)_{i=1}^k$ with respective, pairwise distinct eigenvalues $(\lambda_i)_{i=1}^k$, where $k > 2$. Assume that the vectors are ordered so that $\mathcal{B} := \{v_1,\dots,v_j\}$ is linearly independent (for some $j$, $1 < j < k$), while $\{v_1,\dots,v_j,v_l\}$ is linearly dependent for all $l > j$.

Obviously, $\mathcal{B}$ forms a basis of the space spanned by the vectors $(v_i)_{i=1}^k$. Geometrically, however, each such $v_l$ can be written as a linear combination $v_l := \sum_{i=1}^j \alpha_i v_i$, which determines a parallelepiped (in $j$ or fewer dimensions) whose edges are the vectors $\alpha_i v_i$ and whose vertex opposite $0$ is $v_l$. Now, what happens to that parallelepiped when we apply the transformation $A$?

Since the eigenvalues $(\lambda_i)_{i=1}^j$ are all distinct, each of its edges stretches by a different factor, which means that, as long as at least two of the $\alpha_i$ are nonzero, the diagonal of the transformed parallelepiped will not be collinear with the original one. In other words,

$$\not\exists \lambda_l \colon A v_l = \lambda_l v_l.$$

That is, a vector linearly dependent on eigenvectors with distinct eigenvalues cannot be an eigenvector itself (except in the trivial case $v_l = \alpha_p v_p$ for some $p \le j$).
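For completeness, here is the diagonal argument spelled out algebraically (a sketch using the notation above). Expanding both sides of a hypothetical $Av_l = \lambda_l v_l$ in the basis $\mathcal{B}$ gives

$$Av_l = \sum_{i=1}^j \alpha_i \lambda_i v_i \quad \text{and} \quad \lambda_l v_l = \sum_{i=1}^j \alpha_i \lambda_l v_i,$$

and linear independence of $\mathcal{B}$ forces $\alpha_i(\lambda_i - \lambda_l) = 0$ for every $i$. Since the $\lambda_i$ are distinct, at most one $\alpha_i$ can survive, which is exactly the trivial case $v_l = \alpha_p v_p$.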

Solution 2:

Let $Av_1=\lambda_1v_1$, $Av_2=\lambda_2v_2$, $\lambda_1\neq\lambda_2$, and $v_1,v_2\neq0$. Suppose $v_2=cv_1$ for some scalar $c$. Then

$$Av_2=\lambda_2v_2=c\lambda_2v_1 \quad\text{and}\quad Av_2=A(cv_1)=cAv_1=c\lambda_1v_1,$$

hence $c(\lambda_2-\lambda_1)v_1=0$. Since $v_1\neq0$ and $\lambda_2\neq\lambda_1$, it follows that $c=0$ and thus $v_2=0$, contradicting $v_2\neq0$. Hence $v_1$ and $v_2$ are linearly independent.

Solution 3:

Hint: If an eigenvector $v$ lies in the eigenspace corresponding to the eigenvalue $\lambda$, then the eigenvalue of $v$ must also be $\lambda$.
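One way to unpack the hint (my reading of it): if $v_2$ were a nonzero multiple of $v_1$, it would lie in the eigenspace of $\lambda_1$, so

$$v_2 = cv_1 \implies Av_2 = cAv_1 = \lambda_1 v_2,$$

and the eigenvalue of $v_2$ would be $\lambda_1$, contradicting $\lambda_1 \neq \lambda_2$.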