Is $U=V$ in the SVD of a symmetric positive semidefinite matrix?
Consider the SVD of matrix $A$:
$$A = U \Sigma V^\top$$
If $A$ is a symmetric, positive semidefinite real matrix, is there a guarantee that $U = V$?
Second question (out of curiosity): what is the weakest condition on $A$ that guarantees $U = V$?
First of all, note that $U$ and $V$ are not unique in general. However, one can find a relation between distinct SVDs of a matrix $A$, and working with real matrices makes things easier.
For a general real $A$, assume the singular values of $A$ are non-zero and distinct. If $A=U_1\Sigma V_1^T$ and $A=U_2\Sigma V_2^T$, then from this link there is a diagonal matrix $D=\mathrm{diag}(\pm 1,\dots,\pm 1)$ such that:
$$ U_1=U_2D, \qquad V_1=V_2D. $$
Now suppose that $A$ is a normal matrix with positive eigenvalues. It can be orthogonally diagonalized:
$$ A=Q\Lambda Q^{T}, $$
where $\Lambda$ is diagonal with the (positive) eigenvalues on its diagonal. This is an SVD of $A$. So if $A=U_1\Sigma V_1^T$, then $U_1=QD$ and $V_1=QD$ for some sign matrix $D$, which implies $U_1=V_1$. In other words, being a normal matrix with positive eigenvalues is sufficient for $U=V$. This class includes positive definite matrices. When zero singular values are permitted, the situation is trickier. Take the zero matrix, for instance.
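As a small illustration (my addition, not part of the original argument), here is a NumPy sketch that builds a random symmetric positive definite matrix and checks that the computed SVD satisfies $U = V$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random symmetric positive definite matrix:
# B @ B.T is positive semidefinite; adding I makes it positive definite.
B = rng.standard_normal((4, 4))
A = B @ B.T + np.eye(4)

U, s, Vt = np.linalg.svd(A)

# For a symmetric positive definite matrix, any valid SVD has U = V.
print(np.allclose(U, Vt.T))                 # True
print(np.allclose(A, U @ np.diag(s) @ Vt))  # True: it is a valid SVD
```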
Here is an attempt to provide a clear answer, building upon Arash's answer.
Primer:
- Any matrix $A$ can be decomposed with the Singular Value Decomposition (SVD) as $A = U \Sigma V^\top$, where $U$ and $V$ are orthogonal matrices (unitary in the complex case). This decomposition is not unique: the singular values part $\Sigma$ is unique (up to ordering), but the signs of corresponding columns of $U$ and $V$ can be flipped jointly. Moreover, when a singular value is repeated or zero, there are many possible corresponding singular vectors. The following hold (source):
  - the singular values are equal to the square roots of the eigenvalues of $AA^\top$ (or equivalently of $A^\top A$; resp. $AA^*$ or $A^*A$ for complex matrices)
  - the right singular vectors (columns of $V$) are eigenvectors of $A^\top A$ (resp. $A^*A$)
  - the left singular vectors (columns of $U$) are eigenvectors of $AA^\top$ (resp. $AA^*$)
- If $A$ is real symmetric then, by the spectral theorem, it is orthogonally diagonalizable and therefore has at least one eigendecomposition $A = Q \Lambda Q^{-1} = Q \Lambda Q^\top$ (this post shows a non-diagonalizable counterexample of a complex symmetric matrix). In general this decomposition is not unique: the eigenvalues part $\Lambda$ is unique (up to ordering); however, the eigenvectors part $Q$ is only unique (up to column signs) when the eigenvalues are distinct.
- So, if $A$ is real symmetric:
  - its singular values are the absolute values of its eigenvalues;
  - both the right and left singular vectors (columns of $V$ and $U$) are eigenvectors of $A^\top A = AA^\top = A^2 = Q \Lambda^{2} Q^{-1}$. When the eigenvalues of $A$ are distinct in absolute value, they are therefore eigenvectors of $A$ itself; since they are also unit vectors, they are then equal to the columns of $Q$ up to sign. (A numerical check follows this list.)
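To make the primer concrete, here is a NumPy check (an illustration I am adding, not from the original answer) that the singular values of a real symmetric matrix are the absolute values of its eigenvalues, and that the singular vectors are eigenvectors of $A$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random real symmetric matrix (generally indefinite: some eigenvalues < 0).
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2

eigvals, Q = np.linalg.eigh(A)   # eigendecomposition A = Q diag(eigvals) Q^T
U, s, Vt = np.linalg.svd(A)

# Singular values are the absolute values of the eigenvalues.
print(np.allclose(np.sort(s), np.sort(np.abs(eigvals))))  # True

# Each right singular vector is an eigenvector of A with eigenvalue +-s[i]
# (assuming distinct |eigenvalues|, which holds almost surely here).
for i in range(4):
    v = Vt[i]
    Av = A @ v
    print(np.allclose(Av, s[i] * v) or np.allclose(Av, -s[i] * v))  # True
```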
Now, to translate this into an answer to your question:
- If $A$ is real symmetric and positive definite (i.e. all of its eigenvalues are strictly positive), $\Sigma$ is a diagonal matrix containing the eigenvalues, and $U=V$.
- If $A$ is real symmetric and only positive semidefinite (i.e. all of its eigenvalues are nonnegative, but some can be zero), $\Sigma$ is a diagonal matrix containing the eigenvalues, but there is no guarantee that $U=V$. Indeed, the columns of $U$ and $V$ corresponding to the zero eigenvalues can be any orthonormal basis of the null space of $A$, chosen independently for $U$ and $V$.
- If $A$ is real symmetric but not positive semidefinite (i.e. some of its eigenvalues are negative), then $\Sigma$ is a diagonal matrix containing the absolute values of the eigenvalues. There are then two reasons why $U=V$ is not guaranteed. If there is a zero eigenvalue, see the previous bullet point. If there is a negative eigenvalue, then the sign "taken off" that eigenvalue in $\Lambda$ to construct the (nonnegative by definition) $\Sigma$ has to end up on either $U$ or $V$. For a concrete example, consider a diagonal matrix with at least one negative element (see the sketch after this list).
As noted by Arash, you can replace "real symmetric" with "normal" in all of the statements above.
So, to conclude, a minimal condition for $U=V$ is to be normal and positive definite. Now, is this necessary? Is it proven that non-normal matrices cannot have strictly positive eigenvalues? This is the part I'm not sure about.
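As a numerical probe of that open question (my addition, not a proof): non-normal matrices with strictly positive eigenvalues do exist, e.g. an upper-triangular matrix with a positive diagonal, and for such a matrix the SVD has $U \ne V$:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])            # eigenvalues 1, 1 (strictly positive)

print(np.allclose(A @ A.T, A.T @ A))  # False: A is not normal

U, s, Vt = np.linalg.svd(A)
print(np.allclose(U, Vt.T))           # False: U != V for this non-normal A
```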
If the matrix is symmetric positive semidefinite then $U=V$, as by the spectral theorem the eigenvalue decomposition and the singular value decomposition must coincide. From that we see that $A = U\Lambda U^{-1}=U\Lambda U^\top=U\Sigma V^\top$, and as by the theorem $\Sigma = \Lambda$, then $U=V$.
Note the emphasis on being positive semidefinite. If $\mathbf A$ is singular, there is no such guarantee, and $\mathbf U$ and $\mathbf V$ can be different. As @Arash said, consider the zero matrix: its SVD is not unique.
However, if we consider the column space (span) of $\mathbf A$ and project $\mathbf U$ and $\mathbf V$ onto this space, the projected $\mathbf U$ and $\mathbf V$ are equal.
It seems that non-singularity also provides the necessary condition for $\mathbf U=\mathbf V$, but I need to double-check this.
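To illustrate the singular case (a sketch I am adding under this answer's assumptions): for $A = \mathrm{diag}(1, 0)$ one can write down a perfectly valid SVD in which $U \ne V$, yet the columns associated with the nonzero singular value agree:

```python
import numpy as np

A = np.diag([1.0, 0.0])             # symmetric PSD, but singular

# A hand-built valid SVD with U != V: flip the sign of the
# null-space column in V only (sigma = 0 makes this legal).
U = np.eye(2)
s = np.array([1.0, 0.0])
V = np.diag([1.0, -1.0])

print(np.allclose(A, U @ np.diag(s) @ V.T))  # True: valid SVD
print(np.allclose(U, V))                     # False: U != V
print(np.allclose(U[:, 0], V[:, 0]))         # True: agree on the column space
```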