Relation between left and right eigenvectors corresponding to the same eigenvalue
I have a general question on how the left eigenvectors and right eigenvectors of a matrix are related to each other.
Background. It is easy to see that the characteristic polynomials of $A$ and $A^\top$ are the same, hence the "left" and "right" eigenvalues of $A$ are the same. Is there any geometric reason why this should happen? And moreover, why should there be any relation between the left and right eigenvectors corresponding to the same eigenvalue?
To be more clear, I can prove the following:
Observation. Let $A \in \mathbb{C}^{n\times n}$ have $n$ distinct eigenvalues. Then for an eigenvalue $\lambda$ and corresponding left eigenvector $u^\top$ and right eigenvector $v$, we have $u^\top v \neq 0$.
Proof. Let $J$ be the Jordan canonical form of $A$. Since all the eigenvalues of $A$ are simple, $J$ is diagonal. Let $A = SJS^{-1}$ for some invertible matrix $S$. Observe that for an eigenvalue $\lambda$ there is an index $1 \leq i \leq n$ such that the $i$-th column of $S$, $s^i$, is a right eigenvector of $A$ for the eigenvalue $\lambda$, and the $i$-th row of $S^{-1}$, ${s_i}^\top$, is a left eigenvector of $A$ for the eigenvalue $\lambda$. Since $S^{-1} S = I$ we have ${s_i}^\top s^i = 1$. Because the eigenvalues are distinct, the left and right eigenspaces for $\lambda$ are one-dimensional, so $u^\top$ and $v$ are nonzero scalar multiples of ${s_i}^\top$ and $s^i$; hence $u^\top v \neq 0$.
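As a numerical sanity check of the observation (a sketch of my own, using `scipy.linalg.eig`; note that SciPy's left eigenvectors are returned conjugated, so the row $u^\top$ is `vl[:, i].conj().T`):

```python
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))   # generic, so distinct eigenvalues

# vl[:, i] and vr[:, i] both correspond to the eigenvalue w[i];
# u = vl[:, i].conj() satisfies u^T A = w[i] u^T
w, vl, vr = eig(A, left=True, right=True)

for i in range(len(w)):
    u_dot_v = vl[:, i].conj() @ vr[:, i]
    assert not np.isclose(u_dot_v, 0)   # u^T v != 0, as the observation claims
print("u^T v is nonzero for every eigenvalue")
```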
Note that this need not be true in general, for example when there isn't a full set of eigenvectors, as in $$\begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0\end{bmatrix}.$$ Here the right eigenvectors are the multiples of $e_1$ and the left eigenvectors are the multiples of $e_3^\top$, so $u^\top v = 0$ for every such pair.
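A minimal check of this counterexample:

```python
import numpy as np

# 3x3 nilpotent Jordan block: the only eigenvalue is 0
N = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])

v = np.array([1., 0., 0.])   # right eigenvector: N v = 0
u = np.array([0., 0., 1.])   # left eigenvector:  u^T N = 0
print(N @ v, u @ N)          # both are the zero vector
print(u @ v)                 # 0.0: u^T v vanishes
```

With this counterexample in mind, I want to make the following claim: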
Claim. If $\lambda$ is an eigenvalue of $A$ whose geometric multiplicity equals its algebraic multiplicity, then there are left and right eigenvectors of $A$ corresponding to $\lambda$, say $u^\top$ and $v$, such that $u^\top v \neq 0$.
Note that the claim can't hold for all pairs of left and right eigenvectors, only for some pair. For example, consider the identity matrix: any $u^\top$ and $v$ with $u^\top v = 0$ are still left and right eigenvectors for $\lambda = 1$.
Questions.
1. Is there any geometric reason that the eigenvalues of $A$ and $A^\top$ are equal?
2. Is there an intuitive way to see why the observation above holds?
3. Is the claim above true?
---
Owen Biesel, in a comment on this question, mentions that left eigenvectors are normal to hyperplanes that are preserved under left multiplication by $A$. In that sense, $u^\top v = 0$ would mean exactly that $v$ lies in the $A$-invariant hyperplane normal to $u^\top$. But I can't quite make the connection to prove what I want.
Solution 1:
Your claim is true. It suffices to modify the proof in your question slightly. If the algebraic and geometric multiplicities of some eigenvalue $\lambda$ coincide, then $A=P\begin{pmatrix}\lambda I & \\ & M\end{pmatrix}P^{-1}$ for some change-of-basis matrix $P$ and some submatrix $M$ that is the direct sum of the Jordan blocks for the other eigenvalues. Therefore $u^T=e_1^TP^{-1}$ and $v=Pe_1$ are respectively a left eigenvector and a right eigenvector corresponding to $\lambda$, with $u^Tv=e_1^TP^{-1}Pe_1=1\ne0$.
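A numerical sketch of this construction (the specific matrices here are my own illustration: a random invertible $P$ and an eigenvalue $\lambda=2$ of algebraic and geometric multiplicity two):

```python
import numpy as np

rng = np.random.default_rng(1)
P = rng.standard_normal((4, 4))   # generic, hence invertible
Pinv = np.linalg.inv(P)

# eigenvalue 2 has algebraic multiplicity = geometric multiplicity = 2
D = np.diag([2., 2., 5., 7.])
A = P @ D @ Pinv

u = Pinv[0]       # left eigenvector  u^T = e_1^T P^{-1}
v = P[:, 0]       # right eigenvector v   = P e_1

print(np.allclose(u @ A, 2 * u), np.allclose(A @ v, 2 * v))  # True True
print(u @ v)      # 1.0, exactly as in the proof: e_1^T P^{-1} P e_1 = 1
```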
$A$ and $A^T$ have identical eigenvalues basically because a matrix $B$ (take $B=\lambda I-A$ in your case) is singular if and only if $B^T$ is singular. And the geometric reason for the latter is that every linear map on a finite-dimensional vector space can be decomposed into a product of shears, transpositions and a scaling. That is, $B=E_1E_2\cdots E_kDF_\ell\cdots F_2F_1$, where each $E_i$ or $F_j$ is either a shear matrix (an elementary row/column operation for row/column addition) or a transposition matrix (an elementary operation for exchanging two rows/columns) and $D$ is a scaling (diagonal) matrix. As shear and transposition matrices are nonsingular, and transposing the factorization expresses $B^T$ with the same diagonal $D$, we get: $B$ is singular iff $D$ is singular iff $B^T$ is singular.
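Here is a sketch of that reduction (my own illustrative code, not from the answer): Gaussian elimination with full pivoting applies only transpositions and shears, all invertible, to both rows and columns until a diagonal $D$ remains, so singularity is decided by $D$ alone and is manifestly shared by $B$ and $B^T$.

```python
import numpy as np

def reduce_to_diagonal(B):
    """Apply row/column transpositions (swaps) and shears (adding a
    multiple of one row/column to another) until only a diagonal D
    remains. All of these operations are invertible."""
    D = np.array(B, dtype=float)
    n = D.shape[0]
    for k in range(n):
        # full pivoting: find a nonzero entry in the trailing block
        sub = np.abs(D[k:, k:])
        i, j = np.unravel_index(np.argmax(sub), sub.shape)
        if np.isclose(sub[i, j], 0.0):
            break                            # trailing block is already zero
        D[[k, k + i]] = D[[k + i, k]]        # row transposition
        D[:, [k, k + j]] = D[:, [k + j, k]]  # column transposition
        for r in range(n):                   # row shears clear column k
            if r != k:
                D[r] -= (D[r, k] / D[k, k]) * D[k]
        for c in range(n):                   # column shears clear row k
            if c != k:
                D[:, c] -= (D[k, c] / D[k, k]) * D[:, k]
    return D

B = np.array([[0., 1., 2.],
              [0., 3., 1.],
              [0., 0., 0.]])                 # rank 2, hence singular
D = reduce_to_diagonal(B)
print(np.round(D, 12))                       # diagonal with a zero entry
print(np.isclose(D.diagonal(), 0).any())     # True: B (and B^T) are singular
```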
Solution 2:
I think that the best reason for $A$ and $A^T$ having the same eigenvalues is the fact that they are similar matrices. This can be seen using the Jordan canonical form and the fact that conjugating a matrix by the "anti-diagonal" matrix $Y$ below (note $Y = Y^{-1}$) corresponds to rotating the matrix by 180°. So, if $J_{\lambda}$ is a Jordan block then $Y J_{\lambda}^T Y$ is the same Jordan block $J_\lambda$, where $$ Y = \left[ \begin{array}{ccc} & &1\\ &\cdots & \\ 1 & & \end{array} \right] \ . $$
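A quick check of this 180° rotation trick (a sketch; `np.fliplr(np.eye(3))` builds the anti-diagonal $Y$):

```python
import numpy as np

lam = 2.0
J = np.array([[lam, 1., 0.],
              [0., lam, 1.],
              [0., 0., lam]])      # a single 3x3 Jordan block

Y = np.fliplr(np.eye(3))           # anti-diagonal matrix, Y = Y^{-1}

print(np.allclose(Y @ J.T @ Y, J)) # True: J^T is similar to J
```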