Update: A solution appears on page 167 of the February 2012 issue of the American Mathematical Monthly. There were only three solvers, so we can deduce that this question was unusually difficult for a Monthly problem.


Lemma: Let $X$ and $Y$ be real symmetric $n \times n$ matrices, where the eigenvalues of $X$, $Y$ and $X+Y$ are $\alpha_1 \geq \alpha_2 \geq \cdots \geq \alpha_n$, $\beta_1 \geq \beta_2 \geq \cdots \geq \beta_n$ and $\gamma_1 \geq \gamma_2 \geq \cdots \geq \gamma_n$ respectively. Then $\sum_{i=1}^k \alpha_i + \sum_{i=1}^k \beta_i \geq \sum_{i=1}^k \gamma_i$. Moreover, if we have equality, then we can change bases so that $X$ and $Y$ are simultaneously block diagonal with blocks of size $k \times k$ and $(n-k) \times (n-k)$, and the eigenvalues of the first blocks are $(\alpha_1, \ldots, \alpha_k)$ and $(\beta_1, \ldots, \beta_k)$.
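The inequality in the lemma (a Ky Fan-type trace inequality) is easy to check numerically. Here is a quick sketch with numpy; the matrix size and the number of trials are arbitrary choices:

```python
# Numerical sanity check of the lemma: for random real symmetric X and Y,
# the sum of the top k eigenvalues of X plus that of Y dominates the sum
# of the top k eigenvalues of X + Y, for every k simultaneously.
import numpy as np

def top_k_eigenvalue_sums(M):
    """Return [s_1, ..., s_n] where s_k is the sum of the k largest
    eigenvalues of the symmetric matrix M."""
    eigs = np.sort(np.linalg.eigvalsh(M))[::-1]  # descending order
    return np.cumsum(eigs)

rng = np.random.default_rng(0)
n = 6
for _ in range(100):
    X = rng.standard_normal((n, n)); X = (X + X.T) / 2
    Y = rng.standard_normal((n, n)); Y = (Y + Y.T) / 2
    # small tolerance for floating-point error
    assert np.all(top_k_eigenvalue_sums(X) + top_k_eigenvalue_sums(Y)
                  >= top_k_eigenvalue_sums(X + Y) - 1e-9)
print("lemma verified on 100 random pairs")
```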

Remark: Note that I have deliberately used different variable names than Beni. One of the things that made this problem tricky for me was that I had to apply this lemma in several different ways, with different matrices playing the roles of $X$ and $Y$.

Proof: Thinking of $X$ and $Y$ as quadratic forms, it makes sense to restrict them to subspaces of $\mathbb{R}^n$. With this understanding, we have $\sum_{i=1}^k \alpha_i = \max_{V \in G(k,n)} \mathrm{Tr}\, X|_V$, where the max ranges over $k$-dimensional subspaces of $\mathbb{R}^n$. Moreover, the maximum is achieved precisely when $XV \subseteq V$ (that is, $V$ is $X$-invariant) and the eigenvalues of $X|_V$ are $(\alpha_1, \ldots, \alpha_k)$. When all the $\alpha$'s are distinct, this can be expressed more simply by saying that $V$ is the span of the eigenvectors associated to the largest $k$ eigenvalues.
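This variational characterization is also easy to test numerically: if $Q$ is a matrix whose columns are an orthonormal basis of $V$, then $\mathrm{Tr}\, X|_V = \mathrm{Tr}(Q^T X Q)$. A sketch (the size $n=6$, $k=3$ is an arbitrary choice):

```python
# Check: Tr(Q^T X Q) over orthonormal bases Q of k-dimensional subspaces
# is maximized, at alpha_1 + ... + alpha_k, by the span of the
# eigenvectors of the top k eigenvalues.
import numpy as np

rng = np.random.default_rng(1)
n, k = 6, 3
X = rng.standard_normal((n, n)); X = (X + X.T) / 2

eigvals, eigvecs = np.linalg.eigh(X)   # eigenvalues in ascending order
top_sum = eigvals[-k:].sum()           # alpha_1 + ... + alpha_k
Q_top = eigvecs[:, -k:]                # orthonormal basis of the top-k span
assert np.isclose(np.trace(Q_top.T @ X @ Q_top), top_sum)

# No other k-dimensional subspace does better.
for _ in range(200):
    Q, _ = np.linalg.qr(rng.standard_normal((n, k)))  # random orthonormal basis
    assert np.trace(Q.T @ X @ Q) <= top_sum + 1e-9
print("variational characterization verified")
```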

We want to show that $$\max_{V \in G(k,n)} \mathrm{Tr}\, X|_V + \max_{W \in G(k,n)} \mathrm{Tr}\, Y|_W \geq \max_{V \in G(k,n)} \mathrm{Tr}\, (X+Y)|_V$$ or, equivalently, $$\max_{(V,W) \in G(k,n)^2} \mathrm{Tr} \left( X|_V + Y|_W \right) \geq \max_{V \in G(k,n)} \mathrm{Tr} \left( X|_V + Y|_V \right).$$ But this is obvious, because the left-hand maximum is over a larger set: the right-hand side restricts to the pairs with $W = V$.

Moreover, if we have equality, then there is some $k$-plane $V$ with $XV \subseteq V$ and $YV \subseteq V$, where the eigenvalues of $X|_V$ and $Y|_V$ are $(\alpha_1, \ldots, \alpha_k)$ and $(\beta_1, \ldots, \beta_k)$. Taking a basis adapted to $V$ and its orthogonal complement, this is precisely the required claim about being block diagonal. $\square$

Let $A$ and $B$ be as in the problem. Let the eigenvalues of $A$ be $\lambda^{+}_1 \geq \cdots \geq \lambda^{+}_{p^{+}} > 0 = \cdots = 0 > \lambda^{-}_{1} \geq \cdots \geq \lambda^{-}_{p^{-}}$. Define $\mu^{+}_i$, $\mu^{-}_i$, $q^{+}$ and $q^{-}$ similarly. Then, as Beni explains, we have $p^{+}+p^{-} + q^{+} + q^{-} \leq n$, and the eigenvalues of $A+B$ are the concatenation of the $\lambda^{\pm}_i$'s, the $\mu^{\pm}_i$'s, and $n-p^{+}-p^{-} - q^{+} - q^{-}$ copies of $0$.

Apply the lemma with $k=p^{+}+q^{+}$, $X=A$ and $Y=B$. Equality holds here: the top $p^{+}+q^{+}$ eigenvalues of $A+B$ are precisely the $\lambda^{+}_i$'s and $\mu^{+}_j$'s, while the top $p^{+}+q^{+}$ eigenvalues of $A$ are the $\lambda^{+}_i$'s padded with zeros, and similarly for $B$. This shows that we can split into two smaller problems, one where all the $\lambda$'s and $\mu$'s are positive and another where they are all nonpositive. We will focus on the first block; the second block is similar.

So we are reduced to the case that $A$ and $B$ have no negative eigenvalues.

Now, we use induction on $n$. The base case $n=1$ is easy: the condition $p^{+}+q^{+} \leq n = 1$ forces $A$ or $B$ to be zero. Without loss of generality, assume $\lambda^{+}_1 \geq \mu^{+}_1$. Then $\lambda^{+}_1$ is the greatest eigenvalue of both $A$ and $A+B$.

Apply the lemma with $k=1$, $X=A+B$ and $Y=-B$. The greatest eigenvalue of $-B$ is $0$, since $B$ is positive semidefinite and singular here. So the greatest eigenvalue of $(A+B)+(-B) = A$, namely $\lambda^{+}_1$, is the sum of the greatest eigenvalues of $A+B$ and $-B$, and the equality case of the lemma lets us split off a $1 \times 1$ block from $A+B$ and $-B$, where the block of $-B$ is zero.
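To illustrate this step numerically, here is a sketch in a configuration where the conclusion holds (positive semidefinite $A$, $B$ with orthogonal column spaces, so $AB=0$; the sizes and eigenvalues are made-up values). The equality $\lambda_{\max}(A+B) + \lambda_{\max}(-B) = \lambda_{\max}(A)$ holds, and the top eigenvector of $A+B$ spans the split-off $1 \times 1$ block, on which $B$ vanishes:

```python
# Illustrate the k = 1 equality case with X = A+B and Y = -B.
import numpy as np

rng = np.random.default_rng(3)
n, p, q = 6, 2, 3                              # hypothetical sizes
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = U[:, :p] @ np.diag([5.0, 1.0]) @ U[:, :p].T           # PSD, lambda_1 = 5
B = U[:, p:p+q] @ np.diag([4.0, 2.0, 1.0]) @ U[:, p:p+q].T  # PSD, mu_1 = 4

top = lambda M: np.linalg.eigvalsh(M)[-1]      # greatest eigenvalue
# equality in the lemma: top(A+B) + top(-B) = top((A+B) + (-B)) = top(A)
assert np.isclose(top(A + B) + top(-B), top(A))

# the top eigenvector v of A+B spans the split-off 1x1 block,
# and the corresponding block of -B is zero, i.e. Bv = 0
v = np.linalg.eigh(A + B)[1][:, -1]
assert np.allclose(B @ v, 0)
print("1x1 block split verified")
```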

The product of the $1 \times 1$ blocks is $0$, since the block of $-B$ is zero, and by induction the product of the remaining $(n-1) \times (n-1)$ blocks is $0$ as well. Hence $AB = 0$. $\square$