Is there a "trick" to calculate the sum of singular values of a matrix $A$, without actually finding them?

For example, the sum of the squared singular values is $\operatorname{trace}(A^TA)$.
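For concreteness, a quick numerical check of that identity (just an illustration, assuming NumPy is available) might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))          # an arbitrary example matrix

s = np.linalg.svd(A, compute_uv=False)   # singular values of A
print(np.sum(s**2))                      # sum of squared singular values
print(np.trace(A.T @ A))                 # trace(A^T A) -- same value
```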


Solution 1:

This is an active research topic with wide applications in signal/image processing and other areas of applied mathematics. In most of these applications one does not have the luxury of computing the full SVD, or there is no need for the singular vectors at all. In the literature, the quantity you ask about is called the nuclear norm, and it is closely tied to research on low-rank approximation, which is a good place to start reading.

To give you some feel for it, say $A$ is an $M\times N$ matrix of rank $r$ with $r \ll \min(M,N)$, and consider the non-linear optimization problem
$$\max_{U,V}~~\operatorname{trace}(U^TAV) \quad \text{s.t.}\quad U^TU=I,~~V^TV=I$$
over partially orthogonal matrices $U$ (of dimension $M\times r$) and $V$ (of dimension $N\times r$); partially orthogonal means the columns are orthonormal. The exact maximizers are the factors of the so-called thin SVD (as opposed to the full SVD), and the optimal objective value is the nuclear norm. If you resort to fast sub-optimal approaches to this problem, you obtain estimates of the nuclear norm, which is what you are seeking. Also see the webpage of the popular convex optimization package CVXOPT for its example on nuclear norm approximation (and the references therein).
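A minimal NumPy sketch of the claim above (it computes the thin SVD explicitly, so it only verifies the characterization rather than avoiding the SVD; NumPy and a generic random $A$ of full rank are assumptions on my part):

```python
import numpy as np

rng = np.random.default_rng(1)
M, N = 8, 6
A = rng.standard_normal((M, N))           # dense random A, full rank almost surely

# Thin SVD: U is M x r, Vt is r x N with r = min(M, N) here.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
nuclear_norm = s.sum()                     # sum of the singular values

# Plugging the thin-SVD factors into trace(U^T A V) recovers the nuclear norm.
print(np.trace(U.T @ A @ Vt.T))            # equals nuclear_norm up to rounding
print(nuclear_norm)
```

Sub-optimal choices of $U$ and $V$ in the same objective give lower bounds on the nuclear norm, which is the sense in which fast approximate methods yield estimates of it.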