Fast computation/estimation of the nuclear norm of a matrix
The nuclear norm of a matrix is defined as the sum of its singular values, as given by its singular value decomposition (SVD). It is of central importance in signal processing and statistics, where it is used for matrix completion and dimensionality reduction.
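For reference, the sum-of-singular-values definition translates directly into code; a minimal NumPy sketch (NumPy also exposes the same quantity as the `'nuc'` matrix norm):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((300, 200))

# Direct computation: sum of the singular values.
nuc = np.linalg.svd(A, compute_uv=False).sum()

# NumPy exposes the same quantity as the 'nuc' matrix norm.
assert np.isclose(nuc, np.linalg.norm(A, ord='nuc'))
```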
A question I have is whether it's possible to compute the nuclear norm of a given matrix faster than the time needed to compute its SVD. Since we don't need all the singular values but only their sum, this seems possible. Alternatively, perhaps it could be possible to approximate it with simulation methods and/or random projections.
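To make the random-projection idea concrete, one common route is a randomized range finder in the style of Halko–Martinsson–Tropp: project onto a random subspace, recover the leading singular values, and sum them. This only gives a lower bound (the tail of the spectrum is dropped), and the rank `k` and oversampling parameter below are illustrative choices, not anything prescribed in the question:

```python
import numpy as np

def randomized_nuclear_lower_bound(A, k=20, oversample=10, seed=0):
    """Lower bound on ||A||_* from the leading singular values,
    recovered via a Gaussian random projection (randomized range finder).

    Because only ~k singular values are captured, the result
    underestimates the nuclear norm when the spectrum has a long tail.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    Omega = rng.standard_normal((n, k + oversample))
    Q, _ = np.linalg.qr(A @ Omega)          # orthonormal basis for an approximate range of A
    B = Q.T @ A                             # small (k + oversample) x n matrix
    s = np.linalg.svd(B, compute_uv=False)  # approximates the top singular values of A from below
    return s[:k].sum()
```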
Solution 1:
Since you asked for an approximation as well, you might find the paper "Some simple estimates for singular values of a matrix" by Liqun Qi useful. There are some nice estimates there.
However, if these are not precise enough for you, you might consider computing the SVD at low precision, i.e., run only a few iterations and then use the above estimates. Depending on the size of your matrices, this might give a nice speedup.
Since the estimates are of the form $\sigma_i \in [a_i, b_i]$, you will also have an estimate of the error, so you can keep iterating the SVD computation until the absolute error $\sum_i (b_i - a_i)$, or some relative counterpart of it, is small enough.
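As a concrete reading of that stopping rule, here is a minimal sketch; `singular_value_bounds` is a hypothetical stand-in for whatever produces the brackets $[a_i, b_i]$ (Qi-style estimates, possibly refined by a few SVD iterations):

```python
import numpy as np

def nuclear_norm_from_bounds(a, b):
    """Given brackets sigma_i in [a_i, b_i] for each singular value,
    return a midpoint estimate of the nuclear norm together with
    the absolute error bound sum_i (b_i - a_i)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    estimate = 0.5 * (a + b).sum()
    error = (b - a).sum()
    return estimate, error

# Hypothetical usage (`singular_value_bounds` is not a real library call):
# est, err = nuclear_norm_from_bounds(*singular_value_bounds(A))
# keep refining the bounds until err is small enough
```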
Apart from that, I don't think there is much to be done. The SVD exists precisely to avoid forming $A^*A$ when you want its eigenvalues, so tricks of that kind would not help, unless your matrices have some nice properties you didn't mention.
Estimates and approximations are really not my area of interest, so maybe there are results newer than the above paper.
Solution 2:
This is just a guess, but since the sum of the singular values of $A$ is the maximum of $\operatorname{tr}(AU)$ over orthogonal $U$, might one be able to estimate it by sweeping over Givens rotations?
The maximum over $\theta$ of $\operatorname{tr}\bigl(A\,G(i,j,\theta)\bigr)$ is
$$\operatorname{tr}(A) + \sqrt{\bigl(A_{ii}+A_{jj}\bigr)^2 + \bigl(A_{ij}-A_{ji}\bigr)^2} - A_{ii} - A_{jj}.$$
If this exceeds $\operatorname{tr}(A)$, one could replace $A$ with $A\,G(i,j,\theta^\star)$ and try another pair $(i,j)$.
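A minimal sketch of this sweep, under the assumption that $A$ is square (so that maximizing $\operatorname{tr}(AU)$ over orthogonal $U$ gives the nuclear norm): it greedily applies the best Givens rotation for each pair $(i,j)$ and returns the resulting trace, which is a lower bound on $\|A\|_*$ and may stall at a local stationary point, in keeping with the "just a guess" caveat above:

```python
import numpy as np

def nuclear_norm_lower_bound(A, sweeps=50, tol=1e-12):
    """Heuristic lower bound on ||A||_* via sweeps of Givens rotations.

    Maintains W = A @ U for an (implicit) orthogonal U; tr(W) <= ||A||_*,
    and every accepted rotation increases tr(W). Assumes A is square.
    """
    W = np.array(A, dtype=float)
    n, m = W.shape
    assert n == m, "this sketch assumes a square matrix"
    for _ in range(sweeps):
        improved = False
        for i in range(n - 1):
            for j in range(i + 1, n):
                alpha = W[i, i] + W[j, j]
                beta = W[j, i] - W[i, j]
                r = np.hypot(alpha, beta)
                if r - alpha > tol:            # gain in the trace for the best theta
                    c, s = alpha / r, beta / r
                    # Apply W <- W @ G(i, j, theta): only columns i and j change.
                    col_i, col_j = W[:, i].copy(), W[:, j].copy()
                    W[:, i] = c * col_i - s * col_j
                    W[:, j] = s * col_i + c * col_j
                    improved = True
        if not improved:
            break
    return np.trace(W)
```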
Solution 3:
A tentative answer: the nuclear norm of $A$ is the trace of $\sqrt{A^*A}$ where $A^*$ is the conjugate-transpose of $A$, and $\sqrt{\cdot}$ is the matrix square root. So provided you can calculate matrix square roots faster than singular value decompositions, this might be useful.
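A small sketch of this identity, using `scipy.linalg.sqrtm`; in practice forming $A^*A$ and taking its square root is not obviously cheaper than the SVD, so this is only an illustration of the formula, not a claim of a speedup:

```python
import numpy as np
from scipy.linalg import sqrtm

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 100))

# Nuclear norm via the identity ||A||_* = tr(sqrt(A^* A)).
gram = A.conj().T @ A
via_sqrtm = np.trace(sqrtm(gram)).real

# Equivalent route: sum the square roots of the eigenvalues of A^* A
# (cheaper than a full matrix square root, but still close to SVD cost).
via_eigs = np.sqrt(np.clip(np.linalg.eigvalsh(gram), 0, None)).sum()

# Reference value from the SVD.
via_svd = np.linalg.svd(A, compute_uv=False).sum()

print(via_sqrtm, via_eigs, via_svd)  # all three agree up to rounding error
```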