How to prove that eigenvectors from different eigenvalues are linearly independent

How can I prove that if I have $n$ eigenvectors from different eigenvalues, they are all linearly independent?


Solution 1:

I'll do it with two vectors, and leave it to you to do it in general.

Suppose $\mathbf{v}_1$ and $\mathbf{v}_2$ correspond to distinct eigenvalues $\lambda_1$ and $\lambda_2$, respectively.

Take a linear combination that is equal to $0$, $\alpha_1\mathbf{v}_1+\alpha_2\mathbf{v}_2 = \mathbf{0}$. We need to show that $\alpha_1=\alpha_2=0$.

Applying $T$ to both sides, we get $$\mathbf{0} = T(\mathbf{0}) = T(\alpha_1\mathbf{v}_1+\alpha_2\mathbf{v}_2) = \alpha_1\lambda_1\mathbf{v}_1 + \alpha_2\lambda_2\mathbf{v}_2.$$ Now, instead, multiply the original equation by $\lambda_1$: $$\mathbf{0} = \lambda_1\alpha_1\mathbf{v}_1 + \lambda_1\alpha_2\mathbf{v}_2.$$ Now take the two equations, $$\begin{align*} \mathbf{0} &= \alpha_1\lambda_1\mathbf{v}_1 + \alpha_2\lambda_2\mathbf{v}_2\\ \mathbf{0} &= \alpha_1\lambda_1\mathbf{v}_1 + \alpha_2\lambda_1\mathbf{v}_2 \end{align*}$$ and taking the difference, we get: $$\mathbf{0} = 0\mathbf{v}_1 + \alpha_2(\lambda_2-\lambda_1)\mathbf{v}_2 = \alpha_2(\lambda_2-\lambda_1)\mathbf{v}_2.$$

Since $\lambda_2-\lambda_1\neq 0$, and since $\mathbf{v}_2\neq\mathbf{0}$ (because $\mathbf{v}_2$ is an eigenvector), then $\alpha_2=0$. Using this on the original linear combination $\mathbf{0} = \alpha_1\mathbf{v}_1 + \alpha_2\mathbf{v}_2$, we conclude that $\alpha_1=0$ as well (since $\mathbf{v}_1\neq\mathbf{0}$).

So $\mathbf{v}_1$ and $\mathbf{v}_2$ are linearly independent.

Now try using induction on $n$ for the general case.
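As a quick numerical sanity check of the two-vector argument, here is a small numpy sketch (the $2\times 2$ matrix $A$ below is just an example with distinct eigenvalues, not part of the proof):

```python
import numpy as np

# Example matrix with distinct eigenvalues 1 and 3 (chosen only for illustration).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)      # columns of eigvecs are eigenvectors
lam1, lam2 = eigvals
v1, v2 = eigvecs[:, 0], eigvecs[:, 1]

# Mimic the proof: if a1*v1 + a2*v2 = 0, applying A and subtracting lam1 times
# the original equation kills the v1 term and leaves a2*(lam2 - lam1)*v2 = 0,
# which forces a2 = 0 (and then a1 = 0).
# Numerically, that independence shows up as a nonzero determinant of [v1 | v2].
print("lambda_1, lambda_2:", lam1, lam2)
print("det([v1 | v2]) =", np.linalg.det(np.column_stack([v1, v2])))  # nonzero
```

A nonzero determinant of $[\,\mathbf{v}_1 \mid \mathbf{v}_2\,]$ is exactly the linear independence the proof establishes.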

Solution 2:

Alternative:

Suppose, for contradiction, that $v_1,\dots,v_n$ are linearly dependent, and let $j$ be maximal such that $v_1,\dots,v_j$ are independent (note $j\geq 1$ since $v_1\neq 0$, and $j<n$). Then there exist scalars $c_i$, $1\leq i\leq j$, so that $\sum_{i=1}^j c_iv_i=v_{j+1}$. But by applying $T$ we also have that

$$\sum_{i=1}^j c_i\lambda_iv_i=\lambda_{j+1}v_{j+1}=\lambda_{j+1}\sum_{i=1}^j c_i v_i$$ Hence $$\sum_{i=1}^j \left(\lambda_i-\lambda_{j+1}\right) c_iv_i=0.$$ Since $v_1,\dots,v_j$ are independent, every coefficient $(\lambda_i-\lambda_{j+1})c_i$ must vanish, and since $\lambda_i\neq \lambda_{j+1}$ for $1\leq i\leq j$, this forces every $c_i=0$. But then $v_{j+1}=\sum_{i=1}^j c_iv_i=0$, contradicting that $v_{j+1}$ is an eigenvector.

Hope that helps,
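For a quick numerical illustration of the idea behind this argument, here is a numpy sketch (the $3\times 3$ matrix below is an arbitrary example with three distinct eigenvalues, not part of the answer): if some $v_{j+1}$ really did lie in the span of the earlier eigenvectors, the least-squares residual computed below would be zero.

```python
import numpy as np

# Arbitrary 3x3 example with distinct eigenvalues 1, 2, 3: a diagonal matrix
# conjugated by an invertible S, so the eigenvectors are not the standard basis.
S = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
A = S @ np.diag([1.0, 2.0, 3.0]) @ np.linalg.inv(S)

eigvals, eigvecs = np.linalg.eig(A)
v1, v2, v3 = eigvecs[:, 0], eigvecs[:, 1], eigvecs[:, 2]

# Try to write v3 = c1*v1 + c2*v2 by least squares; the argument says this is
# impossible, so the residual should be clearly nonzero.
c, *_ = np.linalg.lstsq(np.column_stack([v1, v2]), v3, rcond=None)
print("residual norm:", np.linalg.norm(np.column_stack([v1, v2]) @ c - v3))
```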

Solution 3:

Hey, I think there's a slick way to do this without induction. Suppose that $T$ is a linear transformation of a vector space $V$ and that $v_1,\ldots,v_n \in V$ are eigenvectors of $T$ with corresponding eigenvalues $\lambda_1,\ldots,\lambda_n \in F$ ($F$ the field of scalars). We want to show that, if $\sum_{i=1}^n c_i v_i = 0$, where the coefficients $c_i$ are in $F$, then necessarily each $c_i$ is zero.

For simplicity, I will just explain why $c_1 = 0$. Consider the polynomial $p_1(x) \in F[x]$ given as $p_1(x) = (x-\lambda_2) \cdots (x-\lambda_n)$. Note that the $x-\lambda_1$ term is "missing" here. Now, since each $v_i$ is an eigenvector of $T$, we have \begin{align*} p_1(T) v_i = p_1(\lambda_i) v_i && \text{ where} && p_1(\lambda_i) = \begin{cases} 0 & \text{ if } i \neq 1 \\ p_1(\lambda_1) \neq 0 & \text{ if } i = 1 \end{cases}. \end{align*}

Thus, applying $p_1(T)$ to the sum $\sum_{i=1}^n c_i v_i = 0$, we get $$ p_1(\lambda_1) c_1 v_1 = 0 $$ which implies $c_1 = 0$, since $p_1(\lambda_1) \neq 0$ and $v_1 \neq 0$.
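Here is a numpy sketch of the polynomial trick in action (the example matrix, the coefficients $c_i$, and the name `p1_of_A` are illustrative assumptions, not part of the proof):

```python
import numpy as np

# Arbitrary upper-triangular example with distinct eigenvalues 1, 2, 3.
A = np.diag([1.0, 2.0, 3.0]) + np.triu(np.ones((3, 3)), k=1)
eigvals, eigvecs = np.linalg.eig(A)
lam = eigvals
v = [eigvecs[:, i] for i in range(3)]
c = np.array([0.7, -1.3, 2.1])            # arbitrary coefficients
w = sum(ci * vi for ci, vi in zip(c, v))  # w = c1*v1 + c2*v2 + c3*v3

I = np.eye(3)
p1_of_A = (A - lam[1] * I) @ (A - lam[2] * I)   # p1(T), with the (x - lam[0]) factor "missing"

# p1(A) annihilates v2 and v3 and scales v1 by p1(lam[0]), so applying it to w
# isolates the c1*v1 component.
print(np.allclose(p1_of_A @ w,
                  (lam[0] - lam[1]) * (lam[0] - lam[2]) * c[0] * v[0]))  # True
```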

Solution 4:

Let $\vec{v^1},\vec{v^2},\dots,\vec{v^n}$ be eigenvectors of an $n\times n$ matrix $A$ with pairwise distinct eigenvalues $\lambda_1,\lambda_2,\dots,\lambda_n$.

Let $P$ be the $n\times n$ matrix whose columns are the eigenvectors: $$P=\Big[\vec{v^1},\vec{v^2},\dots,\vec{v^n}\Big]$$

Let $\Lambda$ be the $n\times n$ diagonal matrix with the eigenvalues on the diagonal (zeros elsewhere): $$\Lambda = \begin{bmatrix} \lambda_1 & 0 & \dots & 0 \\ 0 & \lambda_2 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & \lambda_n \end{bmatrix} $$ Let $\vec{c}=(c_1,c_2,\dots,c_n)^T$.

We need to show that only $c_1=c_2=\dots=c_n=0$ can satisfy the following: $$c_1\vec{v^1}+c_2\vec{v^2}+\dots+c_n\vec{v^n}= \vec{0}$$ Applying $A$ to this equation gives: $$c_1\lambda_1\vec{v^1}+c_2\lambda_2\vec{v^2}+\dots+c_n\lambda_n\vec{v^n}= \vec{0}$$ We can write this equation in terms of the matrices above:

$$P\Lambda \vec{c}=\vec{0}$$

But since $A$ can be diagonalised to $\Lambda$, we know $P\Lambda=AP$, so $$AP\vec{c}=\vec{0}.$$ Since $AP\neq 0$, we have $\vec{c}=0$.
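A numerical sketch of this matrix formulation in numpy (the matrix $A$ below is just an arbitrary example with distinct eigenvalues, not part of the original answer): it confirms $AP = P\Lambda$ and that $P$ has full rank, which is exactly the linear independence of its columns.

```python
import numpy as np

# Arbitrary example matrix with distinct eigenvalues 4, 2, -1 (illustration only).
A = np.array([[4.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, -1.0]])

eigvals, P = np.linalg.eig(A)   # columns of P are the eigenvectors v^1, ..., v^n
Lam = np.diag(eigvals)          # Lambda: eigenvalues on the diagonal

print(np.allclose(A @ P, P @ Lam))   # the relation A P = P Lambda holds
print(np.linalg.matrix_rank(P))      # full rank (= 3): the columns are independent
```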