Why are singular values always non-negative?

I have read that the singular values of any matrix $A$ are non-negative (e.g., Wikipedia). Is there a reason why?

A first step in computing the SVD of a matrix $A$ is to form $A^{T}A$; the singular values are then the square roots of the eigenvalues of $A^{T}A$. The matrix $A^{T}A$ is certainly symmetric, and the eigenvalues of symmetric matrices are always real. But why are the eigenvalues (and hence the singular values) in this case always non-negative as well?


I'm assuming that the matrix $A$ has real entries, or else you should be considering $A^*A$ instead.

If $A$ has real entries then $A^TA$ is positive semidefinite, since $$ \langle A^TAv,v\rangle=\langle Av,Av\rangle\geq 0$$ for all $v$. Therefore the eigenvalues of $A^TA$ are non-negative.
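This is easy to check numerically. Below is a minimal sketch (the matrix $A$ is an arbitrary random example, not from the text) confirming that the eigenvalues of $A^TA$ are non-negative and that their square roots are exactly the singular values of $A$:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

# eigvalsh returns real eigenvalues of the symmetric matrix A^T A, ascending
eigvals = np.linalg.eigvalsh(A.T @ A)
assert np.all(eigvals >= -1e-12)  # non-negative, up to floating-point rounding

# singular values come back in descending order, so reverse before comparing
singular_values = np.linalg.svd(A, compute_uv=False)
assert np.allclose(np.sqrt(np.maximum(eigvals[::-1], 0)), singular_values)
```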


Assume that $A$ is real for simplicity. The set of (orthogonal, diagonal, orthogonal) triples $(U, \Sigma, V)$ such that $A = U \Sigma V^T$ is not unique. Indeed, if $A = U \Sigma V^T$ then also

$$ A = (-U)(-\Sigma)V^T = U (-\Sigma)(-V^T) = (UD_1)(D_1 \Sigma D_2)(V D_2)^T$$ for any diagonal matrices $D_1$ and $D_2$ with only $1$ or $-1$ on the diagonal. Therefore, the positivity of the singular values is purely conventional.
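A quick numerical sanity check of the last identity, with an arbitrary random $A$ and arbitrary $\pm 1$ sign matrices $D_1$, $D_2$ (all illustrative choices, since $D_1^2 = I$ and $D_2 D_2^T = I$):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
U, s, Vt = np.linalg.svd(A)

D1 = np.diag([1.0, -1.0, 1.0])   # any diagonal matrix of +/-1 entries
D2 = np.diag([-1.0, 1.0, -1.0])

U2 = U @ D1                      # still orthogonal
Sigma2 = D1 @ np.diag(s) @ D2    # still diagonal, but some entries negative
V2t = D2 @ Vt                    # this is (V D2)^T

assert np.allclose(U2 @ Sigma2 @ V2t, A)       # same matrix A
assert np.allclose(U2 @ U2.T, np.eye(3))       # U D1 remains orthogonal
```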


Suppose $T \in \mathcal{L}(V)$, i.e., $T$ is a linear operator on the vector space $V$. Then the singular values of $T$ are the eigenvalues of the positive operator $\sqrt{T^* \; T}$. The eigenvalues of a positive operator are non-negative.

  • Why is $\sqrt{T^* \; T}$ a positive operator? Consider $S = T^* \; T$. Then $S^* = (T^* \; T)^* = T^*\;(T^*)^* = T^*\;T=S$, and hence $S$ is self-adjoint. Also, $\langle Sv, v \rangle = \langle T^*\,T v, v \rangle = \langle Tv, Tv \rangle \geq 0$ for every $v \in V$. Hence $S$ is positive. Now every positive operator has a unique positive square root, which, for $S$, I am denoting by $\sqrt{T^* \; T}$.
  • Why are the eigenvalues of a positive operator non-negative? If $S$ is a positive operator, then $ 0 \leq \langle Sv, v \rangle = \langle \lambda v, v \rangle = \lambda \langle v, v \rangle$, and thus $\lambda$ is non-negative.
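To make the square root concrete: for a real matrix one can build the unique positive square root of $S = A^T A$ from its eigendecomposition and check that its eigenvalues are the singular values of $A$. A minimal sketch, with an arbitrary random $A$:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
S = A.T @ A                        # self-adjoint and positive semidefinite

# S = Q diag(w) Q^T with w >= 0, so the positive root is Q diag(sqrt(w)) Q^T
w, Q = np.linalg.eigh(S)
sqrtS = Q @ np.diag(np.sqrt(np.maximum(w, 0))) @ Q.T

assert np.allclose(sqrtS @ sqrtS, S)           # it really squares to S
# eigenvalues of sqrt(A^T A) (descending) equal the singular values of A
assert np.allclose(np.linalg.eigvalsh(sqrtS)[::-1],
                   np.linalg.svd(A, compute_uv=False))
```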

I think your question is very interesting. Take some nonzero singular value $\sigma_i$. We can reverse its sign: $-\sigma_i = - \sqrt{\lambda_i^2}=-\lambda_i$, where $\lambda_i^2$ is an eigenvalue of $A^T A$ corresponding to an eigenvector $v_i$, that is, $A^T A v_i = \lambda_i^2 v_i$. Who can stop us from writing $A^T A (-v_i) = \lambda_i^2 (-v_i)$ instead? This means we can reverse the sign of a singular value, provided we also go to the matrix $V$ and reverse the sign of its corresponding column.

Hence, there is not a unique way to write $A=U \Sigma V^T$. If we insist that all $\sigma_i$ are non-negative and sorted from largest to smallest, then $\Sigma$ is uniquely determined (otherwise permuting any two columns of $U$ and $V$ together with the corresponding singular values would give further possibilities). Note that $U$ and $V$ themselves are still not quite unique: flipping the signs of corresponding columns of $U$ and $V$ simultaneously gives another valid decomposition, and repeated singular values allow even more freedom.
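The sign-reversal trick above can be checked directly: flip the sign of one $\sigma_i$ and of the matching column of $V$ (that is, the matching row of $V^T$), and the product still reconstructs $A$. A minimal sketch with an arbitrary random $A$ and the index $i=1$ chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3))
U, s, Vt = np.linalg.svd(A)

i = 1                                  # flip the second singular value
s_flipped = s.copy()
s_flipped[i] = -s_flipped[i]
Vt_flipped = Vt.copy()
Vt_flipped[i, :] = -Vt_flipped[i, :]   # flip the corresponding column of V

# the term sigma_i * u_i * v_i^T is unchanged when both signs flip
assert np.allclose(U @ np.diag(s_flipped) @ Vt_flipped, A)
```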