Sum of two eigenvectors for different eigenvalues
Let $V$ be a $k$-vector space and $f\in\text{End}(V)$. Assume that we have eigenvectors $v,w$ of $f$ for different eigenvalues and choose $a,b\in k\setminus\{0\}$.
Claim. $av+bw$ is not an eigenvector of $f$.
Let $\lambda,\mu$ be the (distinct) eigenvalues of $v,w$. We get $$f(av+bw)=af(v)+bf(w)=a\lambda v+b\mu w .$$ If there were an $\alpha$ with $f(av+bw)=\alpha(av+bw)$, then we would get $$a\lambda v+b\mu w=\alpha av+\alpha bw\Rightarrow (\lambda -\alpha)av+(\mu -\alpha)bw=0.$$
I think this won't help. Any other ideas?
Solution 1:
Since $v,w$ are eigenvectors for different eigenvalues, they are linearly independent. Using the linearity of $f$ and the eigenvector equations $f(v)=\lambda v$, $f(w)=\mu w$, we get
$$
f(av+bw)=f(av)+f(bw)=\lambda av+ \mu bw
$$
If $av+bw$ were an eigenvector, there would be some $\beta$ with
$$
\lambda av+ \mu bw=\beta(av+bw) \\
(\lambda-\beta) av+ (\mu - \beta) bw=0
$$
Now we know that $v,w$ are linearly independent, so the only way this expression can be zero is if $(\lambda-\beta)a=0$ and $(\mu-\beta)b=0$. Since $a,b$ are nonzero, the other factors must vanish, which would imply $\beta=\lambda=\mu$, contradicting $\lambda\neq\mu$. A contradiction.
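As a quick sanity check, here is a minimal numerical sketch of the claim (my own example, not part of the proof): take $f$ represented by a diagonal matrix with distinct eigenvalues, form $av+bw$ from eigenvectors for those eigenvalues, and verify that its image is not a scalar multiple of it.

```python
import numpy as np

# f is represented by a diagonal matrix with distinct eigenvalues 2 and 5,
# so the standard basis vectors are eigenvectors for different eigenvalues.
A = np.diag([2.0, 5.0])
v = np.array([1.0, 0.0])   # eigenvector for lambda = 2
w = np.array([0.0, 1.0])   # eigenvector for mu = 5
a, b = 3.0, 4.0
u = a * v + b * w          # u = (3, 4)

Au = A @ u                 # Au = (6, 20)
# If u were an eigenvector, Au would be a scalar multiple of u,
# i.e. the 2x2 matrix with columns u and Au would be singular.
det = np.linalg.det(np.column_stack([u, Au]))
print(det)  # nonzero, so u is not an eigenvector of A
```

The determinant test is just a coordinate-free way of checking collinearity of $u$ and $f(u)$ in the plane spanned by $v$ and $w$.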