If two matrices have the same eigenvalues and eigenvectors, are they equal?

The question stems from a problem I stumbled upon while working with eigenvalues. It asks one to explain why $A^{100}$ is close to $A^\infty$, where

$$A= \left[ \begin{array}{cc} .6 & .2 \\ .4 & .8 \end{array} \right], \qquad A^\infty= \left[ \begin{array}{cc} 1/3 & 1/3 \\ 2/3 & 2/3 \end{array} \right].$$

The given answer (skipping the calculations) is that $A$ has eigenvalues $\lambda_1=1$ and $\lambda_2=0.4$ with eigenvectors $x_1=(1,2)$ and $x_2=(1,-1)$; that $A^{\infty}$ has eigenvalues $\lambda_1=1$ and $\lambda_2=0$ with the same eigenvectors; and that $A^{100}$ has eigenvalues $\lambda_1=1$ and $\lambda_2=(0.4)^{100}$, again with the same eigenvectors. It concludes that, since $A^{100}$ and $A^\infty$ have the same eigenvectors and their eigenvalues are close, the two matrices must be close.
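For concreteness, here is (I believe) the calculation the answer skips, writing $X$ for the matrix whose columns are the eigenvectors $x_1, x_2$:

$$A = X\begin{pmatrix}1&0\\0&0.4\end{pmatrix}X^{-1}, \qquad X=\begin{pmatrix}1&1\\2&-1\end{pmatrix},\quad X^{-1}=\frac13\begin{pmatrix}1&1\\2&-1\end{pmatrix},$$

so

$$A^{100}=X\begin{pmatrix}1&0\\0&0.4^{100}\end{pmatrix}X^{-1},\qquad A^{\infty}=X\begin{pmatrix}1&0\\0&0\end{pmatrix}X^{-1},$$

and therefore

$$A^{100}-A^{\infty}=0.4^{100}\,X\begin{pmatrix}0&0\\0&1\end{pmatrix}X^{-1}=\frac{0.4^{100}}{3}\begin{pmatrix}2&-1\\-2&1\end{pmatrix},$$

whose entries are astronomically small.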

This leads to my actual question: how can one conclude that two matrices with the same eigenvectors and close (or equal) eigenvalues are close (or equal) to each other?

My initial thought is that two matrices with equal eigenvectors and eigenvalues represent the same linear transformation, which is why they are equal. Am I completely off?


Solution 1:

The two matrices $\left[ \begin{array}{cc} 1 & 0 \\ 1 & 1 \end{array} \right] $ and $\left[ \begin{array}{cc} 1 & 0 \\ 200 & 1 \end{array} \right] $ both have the same eigenvalues and eigenvectors, but they are nowhere near equal to each other.
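To see that they really do share their eigenpairs: both are lower triangular with $1$ on the diagonal, so $1$ is the only eigenvalue, and for either matrix $M$,

$$(M-I)\begin{pmatrix}v_1\\v_2\end{pmatrix}=\begin{pmatrix}0\\c\,v_1\end{pmatrix}\quad(c=1\text{ or }200),$$

so an eigenvector must have $v_1=0$; both eigenspaces are spanned by $(0,1)$.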

But if an $n\times n$ matrix has $n$ distinct eigenvalues or otherwise has a set of eigenvectors that form a basis of $\mathbb R^n,$ then the only matrix that has the same eigenpairs, i.e. the same eigenvectors, each with the same eigenvalue, is that same matrix. That is because a linear transformation is completely determined by what it does with a basis.
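Spelled out (writing $B$ for the other matrix and $x_1,\dots,x_n$, $\lambda_1,\dots,\lambda_n$ for the shared eigenpairs): every $v\in\mathbb R^n$ can be written as $v=\sum_i c_i x_i$, and then

$$Av=\sum_i c_i\lambda_i x_i = Bv,$$

so $A$ and $B$ agree on every vector, i.e. $A=B$.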

Solution 2:

If, as the other answers assume, there are $n$ independent eigenvectors, i.e. if the matrices are diagonalizable, then the answer to your question is yes: using the $n$ independent eigenvectors as the basis, we see that the matrices (in that basis) are identical, and therefore they are identical in every basis. However, if the matrices are not diagonalizable, i.e. if there are not $n$ independent eigenvectors, then the matrices are not necessarily the same. For example,

$$A = \begin{pmatrix} 1&1\\ 0&1\\ \end{pmatrix}\;\;\text{and}\;\; B = \begin{pmatrix} 1&2\\ 0&1\\ \end{pmatrix}\,, $$

are different matrices that have the same eigenvectors with the same eigenvalues ($1$ is the only eigenvalue and its eigenspace is one-dimensional).
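Checking that parenthetical claim: for either matrix $M$,

$$(M-I)\begin{pmatrix}v_1\\v_2\end{pmatrix}=\begin{pmatrix}c\,v_2\\0\end{pmatrix}\quad(c=1\text{ or }2),$$

so an eigenvector must have $v_2=0$; both $A$ and $B$ have the single eigenvalue $1$ with eigenspace spanned by $(1,0)$, yet $A\neq B$.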

Solution 3:

Besides needing a complete basis of eigenvectors (as others pointed out), the pairing of eigenvalues with eigenvectors matters if your "equal" means entrywise equal. The following two matrices have the same set of eigenvalues and the same set of eigenvectors:

$$A = \begin{pmatrix} 2&0\\ 0&1\\ \end{pmatrix}\;\;\text{and}\;\; B = \begin{pmatrix} 1&0\\ 0&2\\ \end{pmatrix}\,. $$
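Concretely, the eigenpairs are

$$A:\ \bigl(2,(1,0)\bigr),\ \bigl(1,(0,1)\bigr), \qquad B:\ \bigl(1,(1,0)\bigr),\ \bigl(2,(0,1)\bigr),$$

so the sets of eigenvalues and eigenvectors coincide, but the pairing does not, and indeed $A\neq B$.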

(Conversely, having different eigenvectors does not necessarily mean the matrices are different: any basis of each eigenspace works as a set of eigenvectors. Think of scaled eigenvectors, or of repeated eigenvalues.)
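A concrete instance of that last remark: the identity matrix

$$I=\begin{pmatrix}1&0\\0&1\end{pmatrix}$$

has every nonzero vector as an eigenvector (with eigenvalue $1$), so $\{(1,0),(0,1)\}$ and $\{(1,1),(1,-1)\}$ are equally valid eigenvector bases for one and the same matrix.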