Cholesky decomposition of a positive semi-definite matrix

Solution 1:

If $\mathbf{A}$ is positive semi-definite, the sequence $\{\mathbf{A}_k\} = \{\mathbf{A}+\frac{1}{k}\mathbf{I}\}$ consists of positive definite matrices$^1$. Moreover, $\mathbf{A}_k \to \mathbf{A}$ in operator norm, and writing $\mathbf{A}_k = \mathbf{L}_k\mathbf{L}_k^T$, the factors $\mathbf{L}_k$ converge to a limit we name $\mathbf{L}$. Finally, $\mathbf{L}$ has the desired property, namely $\mathbf{A} = \mathbf{L}\mathbf{L}^T$.

Therefore, an efficient algorithm is to add a small diagonal perturbation $\varepsilon \mathbf{I}$ to $\mathbf{A}$ so that $\mathbf{A} + \varepsilon \mathbf{I}$ is numerically positive definite, and then perform the Cholesky decomposition. We finally remark that computing the eigenvalues of $\mathbf{A}$ instead would require more floating-point operations.
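A minimal NumPy sketch of this idea, assuming $\mathbf{A}$ is symmetric PSD; the starting jitter `1e-10`, the growth factor, and the retry count are illustrative choices, not part of the answer above:

```python
import numpy as np

def jittered_cholesky(A, eps=1e-10, max_tries=10):
    """Cholesky factor of a PSD matrix, adding a small diagonal jitter
    until the matrix is numerically positive definite."""
    jitter = 0.0
    for _ in range(max_tries):
        try:
            # np.linalg.cholesky raises LinAlgError when A + jitter*I
            # is not numerically positive definite.
            return np.linalg.cholesky(A + jitter * np.eye(A.shape[0]))
        except np.linalg.LinAlgError:
            # Start at eps, then grow the jitter by 10x per failure.
            jitter = eps if jitter == 0.0 else jitter * 10.0
    raise np.linalg.LinAlgError("matrix is not PSD up to the maximum jitter")
```

For a rank-deficient PSD matrix such as the all-ones $2\times 2$ matrix, the plain Cholesky call fails, while the jittered version returns a lower-triangular $\mathbf{L}$ with $\mathbf{L}\mathbf{L}^T \approx \mathbf{A}$ up to the size of the jitter.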


$^1$ proof sketch on Wikipedia

Solution 2:

I ran into a similar problem while implementing an algorithm from the following paper.

"Online and Batch Learning of Pseudo-Metrics" Shai Shalev-Shwartz, Yoram Singer and Andrew Y. Ng, ICML 2004.

In this case, too, the algorithm guarantees that the matrix being computed will be positive definite, so the issue seems to be purely numerical. I therefore decided to find the nearest admissible matrix, which allowed me to continue the computation. The following paper outlines how this can be done.

"Computing a nearest symmetric positive semidefinite matrix," Nicholas J. Higham, Linear Algebra and its Applications, Volume 103, May 1988, Pages 103-118

  1. Compute the symmetric part of $A$: $B=(A+A^T)/2$.
  2. Eigendecompose $B$ to obtain eigenvectors $Z$ and eigenvalues $\lambda_i$.
  3. The positive approximant of $A$ is $Z\,\operatorname{diag}(\max(0,\lambda_i))\,Z^T$.

Equivalent Matlab code:

 [Z,L] = eig(0.5*(A+A'));
 A_out = Z*max(L,0)*Z';
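An equivalent NumPy sketch of the same three steps, assuming a real square input matrix (the function name `nearest_psd` is my own label, not from the paper):

```python
import numpy as np

def nearest_psd(A):
    """Nearest symmetric positive semi-definite matrix in Frobenius
    norm, following Higham (1988): symmetrize, eigendecompose, and
    clamp negative eigenvalues to zero."""
    B = (A + A.T) / 2.0            # symmetric part of A
    lam, Z = np.linalg.eigh(B)     # eigenvalues and orthonormal eigenvectors of B
    return Z @ np.diag(np.maximum(lam, 0.0)) @ Z.T
```

The result is symmetric with non-negative eigenvalues, and a matrix that is already PSD is returned unchanged (up to floating-point error).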

The code available at the following website, which solves a similar problem, uses the same technique as well.

OASIS: Large scale online Learning of Image Similarity through ranking