Eigenvalues of Matrix vs Eigenvalues of Operator
I'm having some trouble reconciling the concept of eigenvalues of operators with eigenvalues of matrices: Say you have an $n\times n$ matrix $A$. It represents a linear operator $T:V\to V$ with respect to some basis $\{e_i\}$ in the background. Now my understanding is that
1.) Whatever the basis is, it has no effect on the eigenvalues of $A$. I.e. the equation $\det(A-\lambda I)=0$ has the same solutions regardless of whether we have $\{e_i\}$ or $\{e_i'\}$ as the basis in the background, so long as we keep the entries of $A$ the same in both cases.
2.) The eigenvalues of $A$ are the same as the eigenvalues of $T$ as long as we use the basis $\{e_i\}$ in which $A$ represents $T$.
However, if we keep the entries of $A$ the same, and change the basis in the background, then $A$ represents a different linear operator $T'$. This seems contradictory since \begin{align} \{\text{eigenvalues of} \ T\}&=\{\text{eigenvalues of} \ A \ \text{with respect to basis} \{e_i\}\} \ \ \text{by 2}\\ &=\{\text{eigenvalues of} \ A \ \text{with respect to basis} \{e_i'\}\} \ \ \text{by 1} \\ &=\{\text{eigenvalues of} \ T'\} \ \ \text{by 2} \end{align}
But there's no reason the two operators $T$ and $T'$ should have the same eigenvalues. Can someone point out what's wrong here? Any help would be appreciated.
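For concreteness, here is a small numerical instance of the setup (a sketch with NumPy; the particular matrix $A$ and the second basis are arbitrary choices, not part of the question): $T$ acts on $\mathbb{R}^2$ with matrix $A$ in the standard basis $\{e_i\}$, while $T'$ is the operator whose matrix with respect to the basis $\{e_i'\}$ (the columns of $P$) is the same $A$, so that in standard coordinates $T'$ acts as $PAP^{-1}$.

```python
import numpy as np

# Arbitrary illustrative choices (not from the question itself).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])        # the matrix kept fixed in both cases
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])        # columns are the new basis vectors e_i'

T  = A                            # T  has matrix A w.r.t. the standard basis {e_i}
Tp = P @ A @ np.linalg.inv(P)     # T' has matrix A w.r.t. the basis {e_i'}

print(np.linalg.eigvals(T))       # eigenvalues 2 and 3
print(np.linalg.eigvals(Tp))      # eigenvalues 2 and 3 -- the same, as the chain above predicts
```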
Solution 1:
Two matrices $A,B \in M_n(\mathbb{F})$ are called similar if there exists an invertible matrix $P$ such that $P^{-1}AP = B$. Two linear maps $T,S \colon V \rightarrow V$ on a finite dimensional vector space are called similar if there exists an invertible linear transformation $R$ (which is the same thing as an isomorphism) such that $R^{-1} \circ T \circ R = S$. Consider the following statements:
- The maps $T,S$ are similar if and only if the matrices $[T]_{\mathcal{B}},[S]_{\mathcal{B}}$ representing the maps with respect to an arbitrary ordered basis $\mathcal{B}$ of $V$ are similar.
- Two matrices $A,B \in M_n(\mathbb{F})$ are similar if and only if for any $n$-dimensional vector space $V$, any choice of ordered basis $\mathcal{B}_1$ for $V$, and any operator $T \colon V \rightarrow V$ such that $[T]_{\mathcal{B}_1} = A$ you can find a basis $\mathcal{B}_2$ such that $[T]_{\mathcal{B}_2} = B$. Thus, similar matrices represent the same linear operator, up to a choice of basis.
- Two linear maps $T,S \colon V \rightarrow V$ are similar if and only if they can be represented by the same matrix with respect to two different ordered bases of $V$.
You can check directly that similar linear maps have the same eigenvalues, and this also implies that similar matrices have the same eigenvalues (as the eigenvalues of a matrix $A$ are precisely the eigenvalues of the linear map $T_A \colon \mathbb{F}^n \rightarrow \mathbb{F}^n$ associated to $A$). In your case, $T$ and $T'$ are represented by the same matrix $A$ with respect to two different bases, so by the third statement they are similar and hence have the same eigenvalues.
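For instance, here is a quick numerical check that similar matrices have the same eigenvalues (a sketch with NumPy; the particular matrices below are arbitrary random choices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))   # arbitrary matrix
P = rng.standard_normal((3, 3))   # generically invertible; plays the role of a change of basis
B = np.linalg.inv(P) @ A @ P      # B is similar to A by construction

eig_A = np.sort(np.linalg.eigvals(A))
eig_B = np.sort(np.linalg.eigvals(B))
print(eig_A)
print(eig_B)
print(np.allclose(eig_A, eig_B))  # True: similar matrices have the same spectrum
```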
Solution 2:
If you pick two linear operators $T$ and $T'$ on $V$ at random, then of course there is no reason at all that they should have the same eigenvalues. However, your $T$ and $T'$ are not picked at random. You first give $T$ and define $A$ as its matrix with respect to some basis $\alpha$. Then you choose another basis $\beta$ of $V$ and define $T'$ in such a way that its representation matrix with respect to $\beta$ is $$ [T']_\beta=A. $$ Consequently, your $T$ and $T'$ are related to each other in such a way that $$ [T]_\alpha=[T']_\beta. $$
As a consequence of the following exercise, your $T$ and $T'$ have the same eigenvalues.
Exercise: Show that $T$ and $[T]_\alpha$ have exactly the same eigenvalues for any basis $\alpha$ of $V$.
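One possible outline of the exercise: writing $[v]_\alpha \in \mathbb{F}^n$ for the coordinate vector of $v \in V$ with respect to $\alpha$, we have $[Tv]_\alpha = [T]_\alpha [v]_\alpha$, so $$ Tv=\lambda v \iff [T]_\alpha [v]_\alpha = \lambda [v]_\alpha, $$ and since $v \mapsto [v]_\alpha$ is a linear bijection (in particular $v \neq 0 \iff [v]_\alpha \neq 0$), $\lambda$ is an eigenvalue of $T$ exactly when it is an eigenvalue of $[T]_\alpha$.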
Solution 3:
Why shouldn't there be two linear operators with the same eigenvalues?
Consider a vector space $V$ of dimension 2 with basis $\{e_1, e_2\}$, and linear operators $T$, $T'$ defined by $T(e_1) = e_1$, $T(e_2) = 2e_2$ and $T'(e_1) = 2e_1$, $T'(e_2) = e_2$.
You can check quite easily that these are two distinct linear operators with the same eigenvalues, namely $1$ and $2$.
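With respect to $\{e_1, e_2\}$ their matrices are $\operatorname{diag}(1,2)$ and $\operatorname{diag}(2,1)$; here is a quick NumPy sketch confirming that they are different operators with the same set of eigenvalues.

```python
import numpy as np

T  = np.diag([1.0, 2.0])        # T(e1) = e1,    T(e2) = 2*e2
Tp = np.diag([2.0, 1.0])        # T'(e1) = 2*e1, T'(e2) = e2

print(np.linalg.eigvals(T))     # eigenvalues 1 and 2
print(np.linalg.eigvals(Tp))    # eigenvalues 2 and 1 -- the same set {1, 2}
print(np.array_equal(T, Tp))    # False -- yet T and T' are different operators
```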