Does $AB+BA=0$ imply $AB=BA=0$ when $A$ is a real positive semidefinite matrix?

Solution 1:

Let $v$ be an eigenvector of $A$ with eigenvalue $\lambda$. Then $$ 0 = A(Bv) + B(Av) = A(Bv)+\lambda Bv\Longrightarrow A(Bv) = -\lambda Bv, $$ which means that $Bv$, if nonzero, is an eigenvector of $A$ with eigenvalue $-\lambda$. But since $A$ is positive semi-definite, all eigenvalues of $A$ are $\geq 0$. So we have two possibilities:

  • Either $v$ is an eigenvector with eigenvalue zero, in which case $Av = 0$ and, by the identity above, $A(Bv) = 0$ as well. This means $ABv = BAv = 0$ for all eigenvectors corresponding to eigenvalue zero.
  • Or $v$ is an eigenvector with eigenvalue $\lambda>0$. Suppose $Bv\neq 0$; then, by the observation above, $Bv$ is an eigenvector of $A$ with a negative eigenvalue! This is impossible, so $Bv=0$. Therefore not only $ABv = 0$, but also $BAv = \lambda Bv = 0$.

Since every vector can be written as a linear combination of these eigenvectors, we have $AB=BA = 0$ under these hypotheses.
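As a sanity check, here is a minimal numerical sketch in Python/NumPy (the size, rank, and seed are arbitrary choices, not part of the argument): it solves $AB+BA=0$ as a linear system in the entries of $B$ and confirms that every solution already satisfies $AB=BA=0$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random 4x4 real PSD matrix of rank 2 (eigenvalue 0 has multiplicity 2).
M = rng.standard_normal((4, 2))
A = M @ M.T

# AB + BA = 0 is linear in the entries of B: with row-major vec and A
# symmetric, vec(AB + BA) = (kron(A, I) + kron(I, A)) vec(B).
I = np.eye(4)
K = np.kron(A, I) + np.kron(I, A)

# The null space of K (found via SVD) is exactly the set of B's that
# anticommute with A; pick a random element of it.
_, s, Vt = np.linalg.svd(K)
rank = int(np.sum(s > 1e-10 * s[0]))
null_basis = Vt[rank:]                        # rows = vectorized solutions
B = (rng.standard_normal(null_basis.shape[0]) @ null_basis).reshape(4, 4)

print(np.allclose(A @ B + B @ A, 0))                  # True by construction
print(np.allclose(A @ B, 0), np.allclose(B @ A, 0))   # True True, as proved
```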

Extra: If you are also interested in what this $B$ can actually be when $AB+BA = 0$: let $w$ be any vector, and let $P_0$ be the orthogonal projection onto the eigenspace of $A$ with eigenvalue zero. Then we know the following:

Write $w = P_0w + (1-P_0)w$. By the second observation above, $B(1-P_0)w = 0$, and by the first observation $BP_0w$ again lies in the zero eigenspace, i.e. $(1-P_0)BP_0 = 0$. So in the basis in which $A$ is diagonal, with the zero eigenvalues listed first (since $A$ is symmetric, it is diagonalizable), $B$ is necessarily of the form $$ B=\begin{pmatrix} B_0 & 0\\ 0 & 0 \end{pmatrix}. $$ In other words, $B$ satisfies $AB+BA=0$ if and only if $P_0BP_0=B$.
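A quick numerical illustration of this characterization (again NumPy, with an arbitrary $4\times 4$ example): build $P_0$ from an orthonormal basis of $\ker A$, compress a random matrix as $B = P_0CP_0$, and check both directions.

```python
import numpy as np

rng = np.random.default_rng(1)

# A = Q diag(2, 1, 0, 0) Q^T: the zero-eigenspace is spanned by the
# last two columns of the random orthogonal matrix Q.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
A = Q @ np.diag([2.0, 1.0, 0.0, 0.0]) @ Q.T
P0 = Q[:, 2:] @ Q[:, 2:].T          # orthogonal projection onto ker(A)

# If B = P0 C P0 (so that P0 B P0 = B), then B anticommutes with A ...
B = P0 @ rng.standard_normal((4, 4)) @ P0
print(np.allclose(P0 @ B @ P0, B), np.allclose(A @ B + B @ A, 0))  # True True

# ... while a generic C satisfies neither condition.
C = rng.standard_normal((4, 4))
print(np.allclose(P0 @ C @ P0, C), np.allclose(A @ C + C @ A, 0))  # False False
```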

Solution 2:

Proposition. Let $A,B\in M_n(\mathbb{C})$; if $A$ is diagonalizable with non-negative eigenvalues and $AB+BA=0$, then

  1. $AB=BA=0$.

  2. If moreover $A$ is hermitian, then $A,B$ are simultaneously unitarily similar to $A_1=diag(D_p,0_{n-p})$, $B_1=\begin{pmatrix}0_p&0\\0&T_{n-p}\end{pmatrix}$, where $D_p$ is diagonal with positive entries and $T_{n-p}$ is upper triangular.

Proof. For 1., cf. Hamed's answer (Solution 1 above).

For 2., using syeh_106's proof (Solution 3 below), $A,B$ are simultaneously unitarily similar to $A_1=diag(D_p,0_{n-p})$, $B'=\begin{pmatrix}0_p&0\\0&U_{n-p}\end{pmatrix}$, where $D_p$ is diagonal with positive entries. It remains to triangularize $U_{n-p}$, using Schur's method.
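Here is a sketch of that construction in Python (NumPy plus SciPy's `schur`; the sizes and the particular anticommuting pair are illustrative assumptions): diagonalize $A$ with its positive eigenvalues first, Schur-triangularize the lower-right block of $B$, and absorb the Schur unitary into the change of basis.

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(2)

# An anticommuting pair: A PSD with rank p = 2, B supported on ker(A).
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
A = Q @ np.diag([3.0, 1.0, 0.0, 0.0, 0.0]) @ Q.T
B = np.zeros((5, 5))
B[2:, 2:] = rng.standard_normal((3, 3))
B = Q @ B @ Q.T

# Diagonalize A with eigenvalues sorted in decreasing order.
w, V = np.linalg.eigh(A)
w, V = w[::-1], V[:, ::-1]                  # A = V diag(D_p, 0_{n-p}) V^T
p = int(np.sum(w > 1e-12))

# In this basis B is 0_p (+) U; Schur-factor U = W T W^* (T upper
# triangular) and absorb W into the unitary. A is unaffected because
# the last n-p columns still span ker(A).
U = (V.T @ B @ V)[p:, p:]
T, W = schur(U, output="complex")
V2 = V.astype(complex)
V2[:, p:] = V2[:, p:] @ W

A1 = V2.conj().T @ A @ V2                   # = diag(D_p, 0_{n-p})
B1 = V2.conj().T @ B @ V2                   # = 0_p (+) T
print(np.allclose(A1, np.diag(np.concatenate([w[:p], np.zeros(5 - p)]))))
print(np.allclose(B1[:p, :], 0), np.allclose(B1[:, :p], 0))
print(np.allclose(np.tril(B1[p:, p:], -1), 0))   # T is upper triangular
```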

Solution 3:

An alternative proof is as follows. Since $A$ is p.s.d., $A=UDU^*$, where $U$ is unitary and $D$ is diagonal. Suppose $A$ has $k$ distinct eigenvalues, $\lambda_1 < \lambda_2 < \dots < \lambda_k$. Then $D$ may be chosen to be $D=\lambda_1I_{m_1}\oplus\lambda_2I_{m_2}\oplus\dots\oplus\lambda_kI_{m_k}$, where $m_i$ denotes the multiplicity of $\lambda_i$ and $I_{m_i}$ is the $m_i$-by-$m_i$ identity matrix. (That is, $D$ is block-diagonal, consisting of $k$ blocks.)

Now, note that $AB+BA=0 \Rightarrow UDU^*B=-BUDU^* \Rightarrow DU^*BU=-U^*BUD.$ Letting $\hat B\triangleq U^*BU$, we have $D\hat B=-\hat B D$. Given the form of $D$, it is easy to verify that $\hat B$ must be block-diagonal, i.e. $\hat B=\hat B_1\oplus\hat B_2\oplus\dots\oplus\hat B_k$, where $\hat B_i$ is of the same size as $I_{m_i}$: partitioning $\hat B$ into blocks $\hat B_{ij}$ conformally with $D$, the relation reads $(\lambda_i+\lambda_j)\hat B_{ij}=0$, and $\lambda_i+\lambda_j>0$ whenever $i\neq j$. Moreover, since $\lambda_2, \dots, \lambda_k$ must be positive, the diagonal relation $2\lambda_i\hat B_i=0$ shows that $\hat B_2, \dots, \hat B_k$ must all be zero. If $\lambda_1$ is also positive (i.e. $A$ is p.d.), $\hat B_1$ must also be zero, so $\hat B=0$ and $D\hat B=\hat B D=0$. On the other hand, if $\lambda_1=0$ (i.e. $A$ is merely p.s.d.), a direct multiplication clearly also shows $D\hat B=\hat B D=0$.

Since $AB=UD\hat BU^*$ and $BA=U\hat BDU^*$, it follows that $AB=BA=0$.
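Finally, a small NumPy check of this block argument (the spectrum $\{0,0,1,3\}$ is an arbitrary choice): since $D\hat B=-\hat BD$ reads $(\lambda_i+\lambda_j)\hat B_{ij}=0$ blockwise, $\hat B$ can only be supported on the zero-eigenvalue block, and both products with $D$ vanish.

```python
import numpy as np

rng = np.random.default_rng(3)

# A PSD with spectrum {0, 0, 1, 3}, i.e. D has blocks of sizes 2, 1, 1,
# and a B that anticommutes with it (supported on ker(A), cf. Solution 1).
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
D = np.diag([0.0, 0.0, 1.0, 3.0])
A = Q @ D @ Q.T
B = np.zeros((4, 4))
B[:2, :2] = rng.standard_normal((2, 2))
B = Q @ B @ Q.T

# Bhat = U^* B U with A = U D U^*; here U = Q.
Bhat = Q.T @ B @ Q
print(np.allclose(D @ Bhat + Bhat @ D, 0))                       # True
print(np.allclose(Bhat[2:, :], 0), np.allclose(Bhat[:, 2:], 0))  # True True:
                                           # only the zero block survives
print(np.allclose(D @ Bhat, 0), np.allclose(Bhat @ D, 0))        # True True
print(np.allclose(A @ B, 0), np.allclose(B @ A, 0))              # so AB = BA = 0
```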