If $\mathrm{Tr}(A)=0$, then there is an invertible $R$ such that $T=R^{-1}AR$ has all entries on its main diagonal equal to $0$

Solution 1:

$\newcommand{\trace}[0]{\mathrm{trace}}$A proof by induction on the size $n$ of $A$ appears to be possible, the case $n = 1$ being obvious.

We have an implicit assumption here that the characteristic of the underlying field is zero. (Otherwise the identity matrix $\begin{bmatrix}1&0\\0&1\end{bmatrix}$ is a counterexample in characteristic $2$: its trace is $1 + 1 = 0$, yet its only conjugate is itself, whose diagonal entries are nonzero. Taking the $p \times p$ identity extends the counterexample to any positive characteristic $p$.)

So $A$ is not scalar (a scalar matrix $\lambda I$ has trace $n \lambda$, which vanishes only for $\lambda = 0$, and if $A = 0$ we're done), so there is a vector $v \ne 0$ which is not an eigenvector. In particular $v$ and $A v$ are linearly independent. Consider a basis $\mathcal{B}$ which starts with $v, A v$. The matrix of $A$ with respect to this basis will be of the form $$ \begin{bmatrix} 0 & a\\ b & C\\ \end{bmatrix}, $$ where $C$ is a square $(n-1) \times (n-1)$ matrix, $a$ is a $1 \times (n-1)$ row vector, and $b$ is an $(n-1) \times 1$ column vector.
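To make this basis-change step concrete, here is a small numerical sketch in NumPy, over the reals (the specific matrix and all variable names are mine, chosen to match the notation of the proof):

```python
import numpy as np

# A concrete traceless 3x3 matrix for which v = e1 is not an eigenvector.
A = np.array([[1.0, 0.0, 2.0],
              [1.0, -1.0, 0.0],
              [0.0, 3.0, 0.0]])
assert np.isclose(np.trace(A), 0)

v = np.array([1.0, 0.0, 0.0])
w = A @ v                            # w = (1, 1, 0), independent of v
e3 = np.array([0.0, 0.0, 1.0])
R = np.column_stack([v, w, e3])      # a basis starting with v, Av
T = np.linalg.solve(R, A @ R)        # matrix of A in this basis

# T has the predicted block shape: T[0, 0] == 0, first column (0, 1, 0)^T,
# and the trailing block C = T[1:, 1:] again has trace 0.
print(np.round(T, 12))
```

Printing `T` shows both claims at once: the $(1,1)$ entry is $0$, and the first column is the vector $b$ from the next remark.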

(This seems to me like a quicker proof of Lemma 2 of the paper quoted in a comment.)

(Actually $$ b = \begin{bmatrix} 1\\0\\\vdots\\0 \end{bmatrix}, $$ as $A$ maps the first element of the basis $\mathcal{B}$ onto the second element.)

Since the trace is invariant under change of basis and the $(1,1)$ entry above is $0$, we get $\trace(C) = \trace(A) = 0$. By the induction hypothesis, there is an invertible $(n-1) \times (n-1)$ matrix $Z$ such that all elements on the diagonal of $Z^{-1} C Z$ are zero.

Now $$ \begin{bmatrix} 1 & 0\\ 0 & Z\\ \end{bmatrix}^{-1} \begin{bmatrix} 0 & a\\ b & C\\ \end{bmatrix} \begin{bmatrix} 1 & 0\\ 0 & Z\\ \end{bmatrix} = \begin{bmatrix} 0 & a Z\\ Z^{-1} b & Z^{-1} C Z\\ \end{bmatrix}, $$ and we're done.
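The induction is effectively an algorithm, and it can be run numerically. Below is a sketch in NumPy over the reals; the function name `zero_diag_conjugator`, the tolerance handling, and the greedy basis completion are my own choices, not part of the proof:

```python
import numpy as np

def zero_diag_conjugator(A, tol=1e-9):
    """Return an invertible R such that R^{-1} A R has zero diagonal.

    Direct transcription of the inductive proof; assumes trace(A) == 0,
    so (in characteristic zero) a scalar A must be the zero matrix."""
    n = A.shape[0]
    if n == 1 or np.allclose(A, 0, atol=tol):
        return np.eye(n)               # A = 0: nothing to conjugate

    # Pick v that is not an eigenvector.  If every standard basis vector
    # were an eigenvector, A would be diagonal, and then (A being
    # non-scalar) the all-ones vector is not an eigenvector.
    for v in list(np.eye(n).T) + [np.ones(n)]:
        w = A @ v
        if np.linalg.matrix_rank(np.column_stack([v, w]), tol=tol) == 2:
            break

    # Complete (v, Av) to a basis, greedily adding standard basis vectors.
    cols = [v, w]
    for e in np.eye(n).T:
        if len(cols) == n:
            break
        if np.linalg.matrix_rank(np.column_stack(cols + [e]), tol=tol) == len(cols) + 1:
            cols.append(e)
    R1 = np.column_stack(cols)

    # In this basis A is [[0, a], [b, C]] with trace(C) = 0; recurse on C.
    T = np.linalg.solve(R1, A @ R1)
    Z = zero_diag_conjugator(T[1:, 1:], tol)

    P = np.eye(n)
    P[1:, 1:] = Z                      # the block-diagonal conjugation above
    return R1 @ P

# Demo on a random traceless 5x5 matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A -= (np.trace(A) / 5) * np.eye(5)
R = zero_diag_conjugator(A)
T = np.linalg.solve(R, A @ R)
print(np.round(np.diag(T), 8))         # diagonal is (numerically) zero
```

The recursion mirrors the proof exactly: the only extra work is completing $(v, Av)$ to a basis, which the proof takes for granted.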


The proof can be quickly adapted to yield Corollary 4 of the paper mentioned above, that is, you can conjugate any matrix $A$ to a matrix where all diagonal elements are the same. So this diagonal element has to be $\lambda = \trace(A)/n$. (But please do check Marc van Leeuwen's comment below, to the effect that you can reduce the general case to the case of $\trace(A) = 0$ already treated above.)

In fact, if $A$ is not already scalar, there is a nonzero vector $v$ which is not an eigenvector. So $v, A v$ are linearly independent, and hence so are $v, A v - \lambda v$. Consider a basis $\mathcal{B}$ which starts with $v, A v - \lambda v$. With respect to this basis, $A$ has the form $$ \begin{bmatrix} \lambda & a\\ b & C\\ \end{bmatrix}, $$ as $A v = \lambda v + (A v - \lambda v)$.

Now $\trace(C) = \trace(A) - \lambda = (n-1) \lambda$, so the induction hypothesis (every diagonal entry of a suitable conjugate of $C$ equals $\trace(C)/(n-1) = \lambda$) applies.
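The reduction from Marc van Leeuwen's comment can also be checked in a few lines: subtract $\lambda I$, conjugate the traceless part to zero diagonal, and the $\lambda I$ comes along for free. A minimal NumPy sketch for $n = 2$, where the trailing block $C$ is $1 \times 1$ with trace $0$, so the basis $(v, Bv)$ already does the whole job (the concrete matrix is mine):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [2.0, 5.0]])
lam = np.trace(A) / 2              # target common diagonal entry, here 4
B = A - lam * np.eye(2)            # traceless part of A

v = np.array([1.0, 0.0])           # e1 is not an eigenvector of B here
R = np.column_stack([v, B @ v])    # basis (v, Bv): R^{-1} B R has zero diagonal
T = np.linalg.solve(R, A @ R)      # equals R^{-1} B R + lam * I
print(np.diag(T))                  # [4. 4.]
```

The point of the reduction is visible in the last comment: conjugation fixes $\lambda I$, so zeroing the diagonal of $B$ is the same as making the diagonal of $A$ constant.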