If $\rho\leq\sigma$, is $\operatorname{rank}(\rho)\leq\operatorname{rank}(\sigma)$?

For positive semidefinite matrices $\rho$ and $\sigma$, I wonder whether the following holds:

If $\rho\leq\sigma$, is $\operatorname{rank}(\rho)\leq \operatorname{rank}(\sigma)$?

Here $A\leq B$ means that $B-A$ is positive semidefinite. If $\rho$ and $\sigma$ commute this is easy to see, but what happens for non-commuting $\rho$ and $\sigma$?


Solution 1:

The answer to this is yes. An easy proof is as follows: suppose that $\rho$ and $\sigma$ are $n \times n$ with $\rho \leq \sigma$. Note that

  • $\dim \ker \rho = n - \operatorname{rank}(\rho)$,
  • $\ker \rho = \{x \in \Bbb R^n : x^T\rho x = 0\}$,
  • $x^T\rho x \leq x^T \sigma x$ for all $x \in\Bbb R^n$.

Indeed, if $x \in \ker \sigma$, then $0 \leq x^T\rho x \leq x^T\sigma x = 0$, so $x \in \ker \rho$. Conclude that $\ker(\rho) \supseteq \ker (\sigma)$, so that $\dim \ker \rho \geq \dim \ker \sigma$, so that $\operatorname{rank}(\rho) \leq \operatorname{rank}(\sigma)$.
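
Not part of the argument itself, but here is a quick numerical sanity check (a sketch in NumPy; building $\sigma$ as $\rho$ plus another random positive semidefinite matrix is just my choice of test case, since that guarantees $\rho \leq \sigma$):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_psd(n, rank):
    """Random n x n positive semidefinite matrix of the given rank."""
    G = rng.standard_normal((n, rank))
    return G @ G.T

n = 6
for _ in range(1000):
    rho = random_psd(n, rng.integers(1, n + 1))
    # sigma = rho + (something PSD) guarantees rho <= sigma
    sigma = rho + random_psd(n, rng.integers(0, n + 1))
    assert np.linalg.matrix_rank(rho) <= np.linalg.matrix_rank(sigma)

print("rank(rho) <= rank(sigma) held in every trial")
```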

Note: this argument uses the definition of positive semidefinite under which the matrix is also required to be symmetric (or Hermitian in the complex case). Under the alternative definition, where we only require $x^T\rho x \geq 0$ for real vectors $x$, the statement does not hold.
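
To see why symmetry matters, here is a small illustration (my own example, not part of the answer): for a nonzero skew-symmetric $\rho$ the quadratic form $x^T\rho x$ vanishes identically, so $\rho \leq 0$ holds under the weaker definition even though $\operatorname{rank}(\rho) = 2 > 0 = \operatorname{rank}(0)$.

```python
import numpy as np

# A nonzero skew-symmetric matrix: x^T rho x = 0 for every real x.
rho = np.array([[0.0, 1.0],
                [-1.0, 0.0]])
sigma = np.zeros((2, 2))   # sigma - rho is again skew-symmetric

rng = np.random.default_rng(1)
xs = rng.standard_normal((1000, 2))
# Both quadratic forms x^T rho x and x^T (sigma - rho) x vanish identically,
# so rho <= sigma in the "quadratic form only" sense ...
assert np.allclose(np.einsum('ni,ij,nj->n', xs, rho, xs), 0.0)
assert np.allclose(np.einsum('ni,ij,nj->n', xs, sigma - rho, xs), 0.0)

# ... yet the rank comparison fails:
print(np.linalg.matrix_rank(rho), np.linalg.matrix_rank(sigma))  # 2 0
```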


Proof that $\ker \rho = \{x \in \Bbb R^n : x^T\rho x = 0\}$: it is clear that $\ker \rho \subseteq \{x \in \Bbb R^n : x^T\rho x = 0\}$, but the reverse inclusion requires proof.

One option is to see this as a consequence of the Rayleigh-Ritz theorem: if $x$ minimizes $x^T\rho x$ subject to the constraint that $\|x\| = 1$, then $x$ must be an eigenvector of $\rho$. If $\rho x = \lambda x$ and $x^T\rho x = 0$, then we must have $\lambda x^Tx = 0 \implies \lambda = 0$, so that we indeed have $\rho x = 0$.
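
A small numerical illustration of this Rayleigh-Ritz step (a sketch with a test matrix of my own choosing): the minimum of $x^T\rho x$ over unit vectors is the smallest eigenvalue, attained at the corresponding eigenvector, which for a singular positive semidefinite $\rho$ is a kernel vector.

```python
import numpy as np

rng = np.random.default_rng(2)
G = rng.standard_normal((5, 3))
rho = G @ G.T                       # rank-3 PSD matrix, so its smallest eigenvalue is 0

evals, evecs = np.linalg.eigh(rho)  # eigenvalues in ascending order
v = evecs[:, 0]                     # unit eigenvector for the smallest eigenvalue

# Rayleigh quotients of random unit vectors never drop below the smallest
# eigenvalue, and the minimizing eigenvector attains v^T rho v = 0, which
# then forces rho v = 0.
xs = rng.standard_normal((10000, 5))
xs /= np.linalg.norm(xs, axis=1, keepdims=True)
quotients = np.einsum('ni,ij,nj->n', xs, rho, xs)

print(quotients.min() >= evals[0] - 1e-12)                    # True
print(np.allclose(v @ rho @ v, 0), np.allclose(rho @ v, 0))   # True True
```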

Another option is to use the existence of a positive semidefinite square root: let $\tau = \sqrt{\rho}$, and note that $$ \tau x = 0 \iff (\tau x)^T(\tau x) = x^T\rho x = 0. $$ Since $\tau^2 = \rho$, it follows that $x^T \rho x = 0 \implies \tau x = 0 \implies \rho x = \tau(\tau x) = 0$.
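
This one is also easy to check numerically; a sketch using scipy.linalg.sqrtm on a rank-deficient test matrix of my own choosing:

```python
import numpy as np
from scipy.linalg import sqrtm

rng = np.random.default_rng(3)
G = rng.standard_normal((5, 2))
rho = G @ G.T                          # PSD with a nontrivial kernel
tau = np.real(sqrtm(rho))              # PSD square root: tau @ tau ~ rho

# Key identity behind the argument: ||tau x||^2 = x^T rho x for every x,
# so x^T rho x = 0 forces tau x = 0 and hence rho x = tau (tau x) = 0.
xs = rng.standard_normal((1000, 5))
assert np.allclose(np.linalg.norm(xs @ tau, axis=1) ** 2,
                   np.einsum('ni,ij,nj->n', xs, rho, xs))

x = np.linalg.svd(rho)[2][-1]          # a (numerical) kernel vector: x^T rho x ~ 0
# loose atol for tau @ x: sqrtm turns the ~1e-16 "zero" eigenvalues into ~1e-8 ones
print(np.allclose(x @ rho @ x, 0),
      np.allclose(tau @ x, 0, atol=1e-6),
      np.allclose(rho @ x, 0))         # True True True
```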

We could similarly make use of the existence of the Cholesky decomposition.

Another option, while I'm at it. Let $\rho$ be a positive semidefinite matrix, and suppose for the purpose of contradiction that $x^T \rho x = 0$ but $\rho x \neq 0$. Let $B$ denote the matrix with columns $x,\rho x$. We find that $$ B^T \rho B = \pmatrix{x^T\rho x & x^T \rho^2 x\\ x^T \rho^2 x & x^T \rho^3 x} = \pmatrix{0 & \|\rho x\|^2 \\ \|\rho x\|^2 & x^T \rho^3 x}. $$ Since $\rho x \neq 0$, this matrix has determinant $-\|\rho x\|^4 < 0$ and therefore fails to be positive semidefinite, which is impossible because $B^T \rho B$ is positive semidefinite whenever $\rho$ is. Equivalently, we could have shown directly that $(By)^T\rho(By) \geq 0$ cannot hold for all $y \in \Bbb R^2$.
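
Read contrapositively (my own illustration, not from the answer): if a symmetric matrix admits an $x$ with $x^T\rho x = 0$ but $\rho x \neq 0$, the $2\times 2$ compression $B^T\rho B$ exposes a negative determinant, certifying that $\rho$ was not positive semidefinite to begin with.

```python
import numpy as np

# A symmetric matrix that is NOT positive semidefinite:
rho = np.array([[0.0, 1.0],
                [1.0, 0.0]])
x = np.array([1.0, 0.0])

print(x @ rho @ x)                  # 0.0, and yet ...
print(rho @ x)                      # [0. 1.]  (nonzero)

B = np.column_stack([x, rho @ x])   # columns x and rho x
M = B.T @ rho @ B
print(M)
print(np.linalg.det(M))             # -1.0 < 0: M is not PSD, so neither was rho
```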

Solution 2:

Write $A = \rho$ and $B = \sigma$, and suppose $x \in \ker B$. Since $B-A$ is positive semidefinite we have $x^T(B-A)x \geq 0$; hence $-x^TAx \geq 0$, since $Bx=0$. But $A$ is also positive semidefinite, so $x^TAx \geq 0$. That is, we have $x^TAx = 0$.

Since $A$ is positive semidefinite (in particular symmetric), there exists an orthonormal basis $(v_i)_{i \in I}$ of eigenvectors of $A$ with $v_i^Tv_i = 1$. Expand $x$ in this basis: $x = \sum a_iv_i$, so that $Ax = \sum \lambda_ia_iv_i$, where the $\lambda_i$ are the non-negative eigenvalues of $A$. Since the $v_i$ are orthonormal we get $x^TAx = \sum \lambda_ia_i^2$. For this to be $0$ we must have $a_i=0$ whenever $\lambda_i > 0$. So $x$ is a linear combination of eigenvectors with eigenvalue $0$; in other words, $x \in \ker A$.

We have shown that $\ker B \subseteq \ker A$. Hence $\operatorname{rank} A \leq \operatorname{rank} B$.
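
A numerical rendering of this argument (a sketch; the choice of test matrices $A$ and $B$ is mine): take a kernel vector of $B$, expand it in an orthonormal eigenbasis of $A$, and watch the coefficients on the positive eigenvalues vanish.

```python
import numpy as np

rng = np.random.default_rng(4)

def random_psd(n, rank):
    """Random n x n positive semidefinite matrix of the given rank."""
    G = rng.standard_normal((n, rank))
    return G @ G.T

n = 6
A = random_psd(n, 3)           # PSD of rank 3
B = A + random_psd(n, 2)       # B - A is PSD, so A <= B; rank(B) <= 5 < n

# Take x in ker B: the right singular vector for the (numerically) zero singular value.
x = np.linalg.svd(B)[2][-1]
assert np.allclose(B @ x, 0)

lam, V = np.linalg.eigh(A)     # orthonormal eigenbasis of A, eigenvalues ascending
a = V.T @ x                    # coefficients of x in that basis
# x^T A x = sum_i lam_i * a_i^2 vanishes, so a_i ~ 0 wherever lam_i > 0 ...
print(x @ A @ x, np.abs(a[lam > 1e-10]).max())
# ... which places x in ker A as well:
print(np.allclose(A @ x, 0))   # True
```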