Upper bound for the sum of absolute values of the eigenvalues
Let $\mathbf A = (a_{ij})$ be an $n\times n$ real or complex matrix with eigenvalues $\lambda_j$, for $j=1,\dots,n$. It is known that $\max_j|\lambda_j|$ is bounded above by the maximum row sum of $\mathbf A$ (using entrywise absolute values). Does the following related bound also hold?
Question: Is it true that $$ \sum_{j=1}^n|\lambda_j| \le \sum_{i,j=1}^{n}|a_{ij}|, $$ where the right hand side is summed over all matrix elements$\,$?
For example, this inequality is obvious for triangular matrices as well as positive definite matrices and slightly less obvious for unitary matrices (using the fact that the rows are unit vectors). It can also be proved in other special cases but is it true in general?
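Before diving into a proof, the conjectured inequality is easy to spot-check numerically. Here is a quick sketch with NumPy; the helper names `eig_abs_sum` and `entry_abs_sum` are my own, not from any library:

```python
import numpy as np

def eig_abs_sum(A):
    """Sum of the absolute values of the eigenvalues of A."""
    return np.abs(np.linalg.eigvals(A)).sum()

def entry_abs_sum(A):
    """Sum of the absolute values of all entries of A."""
    return np.abs(A).sum()

# Spot-check the conjectured inequality on random complex matrices.
rng = np.random.default_rng(0)
for _ in range(1000):
    n = int(rng.integers(2, 7))
    A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    assert eig_abs_sum(A) <= entry_abs_sum(A) + 1e-9
```

No counterexample turns up among random matrices, which is at least consistent with the inequality being true in general.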
Thanks.
Solution 1:
Note. An earlier version of this answer only works for normal matrices, thanks to themaker for pointing this out. Fortunately, the answer to this wonderful post appears to furnish the step to generalize the proof to an arbitrary complex square matrix.
Statement
We want to show that for a complex matrix $\mathbf A$, $$ \sum_{i = 1}^n |\lambda_i| \le \sum_{i, j = 1}^n |a_{ij}|. \tag{1} $$ The trick is to use the singular value decomposition (SVD) or the Schur decomposition, but we shall go through some simpler cases first.
Diagonal matrix
If the matrix is diagonal, then the eigenvalues coincide with the diagonal entries, $$ \lambda_i = a_{ii}. $$ The statement is obviously true, for $$ \sum_{i=1}^n |\lambda_{i}| = \sum_{i=1}^n |a_{ii}| \le \sum_{i,j=1}^n |a_{ij}|. $$
Positive definite matrix
Second, we observe that the sum of the eigenvalues is equal to the trace of the matrix (i.e., the sum of the diagonal entries). Each eigenvalue is a root of \begin{align} 0 &= |\lambda \mathbf I - \mathbf A| = \lambda^n - \left( \sum_{i = 1}^n a_{ii} \right) \lambda^{n-1} + \cdots. \end{align} By Vieta's formulas, we have \begin{align} \sum_{i=1}^n \lambda_i = \sum_{i = 1}^n a_{ii}. \end{align}
Now if the matrix is positive definite, we are done, because its eigenvalues are positive, so \begin{align} \sum_{i = 1}^n |\lambda_i| &= \sum_{i = 1}^n \lambda_i = \sum_{i = 1}^n a_{ii} \le \sum_{i,j=1}^n |a_{ij}|. \end{align}
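This case can also be checked numerically. A sketch with NumPy (the function `check_pd` is a name I made up for illustration):

```python
import numpy as np

def check_pd(A):
    """For a positive definite A: sum |lambda_i| = Tr A <= sum |a_ij|."""
    eig = np.linalg.eigvalsh(A)              # real eigenvalues of a Hermitian A
    return (np.all(eig > 0)                                  # positive definite
            and np.isclose(np.abs(eig).sum(), np.trace(A))   # sum |l_i| = Tr A
            and np.trace(A) <= np.abs(A).sum() + 1e-12)      # Tr A <= sum |a_ij|

# B @ B.T + I is positive definite by construction.
rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
assert check_pd(B @ B.T + np.eye(4))
```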
General
Generally, a complex matrix admits a singular value decomposition: $$ \mathbf A = \mathbf U \mathbf \Sigma \mathbf V^\dagger, $$ where $\mathbf U$ and $\mathbf V$ are unitary matrices, and $\mathbf \Sigma$ is a diagonal matrix whose diagonal entries $\sigma_{ii}$ are nonnegative; these are called the singular values.
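Concretely, the decomposition looks like this in NumPy (a sketch; note that `np.linalg.svd` returns $\mathbf V^\dagger$ rather than $\mathbf V$, and `svd_factors` is my own wrapper name):

```python
import numpy as np

def svd_factors(A):
    """Return (U, Sigma, V) with A = U @ Sigma @ V^dagger."""
    U, s, Vh = np.linalg.svd(A)
    return U, np.diag(s), Vh.conj().T

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
U, Sigma, V = svd_factors(A)
assert np.allclose(U @ Sigma @ V.conj().T, A)   # reconstruction A = U S V^dagger
assert np.all(np.diag(Sigma) >= 0)              # singular values are nonnegative
```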
Now in this post, 23rd showed, by using the Schur decomposition, that $$ \sum_{i = 1}^n |\lambda_i| \le \sum_{i = 1}^n \sigma_{ii} = \mathrm{Tr} \, \mathbf \Sigma. \tag{2} $$
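Inequality (2) is also easy to confirm numerically (a sketch; `nuclear_bound_holds` is my own name for the check):

```python
import numpy as np

def nuclear_bound_holds(A, tol=1e-9):
    """Check inequality (2): sum |lambda_i| <= sum sigma_i."""
    eig_sum = np.abs(np.linalg.eigvals(A)).sum()
    sv_sum = np.linalg.svd(A, compute_uv=False).sum()
    return eig_sum <= sv_sum + tol

rng = np.random.default_rng(3)
for _ in range(200):
    A = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
    assert nuclear_bound_holds(A)
```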
Since $\mathbf A = \mathbf U \mathbf \Sigma \mathbf V^\dagger$ gives $\mathbf \Sigma = \mathbf U^\dagger \mathbf A \mathbf V$, and the trace is invariant under cyclic permutations, $\mathrm{Tr}(\mathbf X \mathbf Y) = \mathrm{Tr}(\mathbf Y \mathbf X)$, we have \begin{align} \mathrm{Tr} \mathbf \Sigma &= \mathrm{Tr} \left( \mathbf U^\dagger \mathbf A \mathbf V \right) \\ &= \mathrm{Tr} \left( \mathbf V \, \mathbf U^\dagger \mathbf A \right) = \mathrm{Tr} \left( \mathbf A \mathbf V \, \mathbf U^\dagger \right) . \tag{3} \end{align}
If we define $\mathbf W = \mathbf V \mathbf U^\dagger$, and denote the $k$th column of $\mathbf A$ by $\mathbf a_k$, then \begin{align} \sum_{i = 1}^n |\lambda_i| &\le \mathrm{Tr} \mathbf \Sigma = \mathrm{Tr} \left( \mathbf W \mathbf A \right) = \sum_{i = 1}^n \left( \mathbf W \mathbf a_i \right)_i \\ &\le \sum_{i = 1}^n \left| \left( \mathbf W \mathbf a_i \right)_i \right| \le \sum_{i = 1}^n \| \mathbf W \mathbf a_i \| = \sum_{i = 1}^n \| \mathbf a_i \| \tag{4} \\ &\le \sum_{i,j=1}^n |a_{ij}|, \end{align} where $\| \cdots \|$ denotes the Euclidean norm. We have used the fact that $\mathbf W$ is a unitary matrix, so that $$ \| \mathbf W \mathbf a_i \|^2 = (\mathbf W \mathbf a_i)^\dagger (\mathbf W \mathbf a_i) = \mathbf a_i^\dagger \mathbf W^\dagger \mathbf W \mathbf a_i = \mathbf a_i^\dagger \mathbf a_i = \| \mathbf a_i \|^2, $$ as well as the inequality $$ \| \mathbf x \| = \sqrt{ \sum_{i = 1}^n |x_i|^2 } \le \sum_{i = 1}^n |x_i|. $$
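Each step of this chain can be verified numerically as well (a NumPy sketch; `verify_chain` is my own name):

```python
import numpy as np

def verify_chain(A, tol=1e-9):
    """Numerically verify each step of the chain (4) for a square matrix A."""
    U, s, Vh = np.linalg.svd(A)
    W = Vh.conj().T @ U.conj().T                       # W = V U^dagger, unitary
    eig_sum = np.abs(np.linalg.eigvals(A)).sum()       # sum |lambda_i|
    tr_sigma = s.sum()                                 # Tr Sigma
    tr_WA = np.trace(W @ A)                            # equals Tr Sigma, eq. (3)
    col_norm_sum = np.linalg.norm(A, axis=0).sum()     # sum_i ||a_i||
    entry_sum = np.abs(A).sum()                        # sum |a_ij|
    assert np.isclose(np.real(tr_WA), tr_sigma) and abs(np.imag(tr_WA)) < tol
    return (eig_sum <= tr_sigma + tol                  # inequality (2)
            and tr_sigma <= col_norm_sum + tol         # inequality (4)
            and col_norm_sum <= entry_sum + tol)       # norm <= sum of |entries|

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
assert verify_chain(A)
```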
In fact, we have just shown the stronger statement (4). Also note that the columns $\mathbf a_i$ can be replaced by the rows of $\mathbf A$ if we use the second form of (3) and follow a similar line of reasoning.
Note
It is actually easier, and more efficient, to argue the same thing using the Schur decomposition directly, following the post cited.