When we have a symmetric positive-definite matrix $A = LL^*$, we can obtain $L$ using the Cholesky decomposition of $A$ ($L^*$ is the conjugate transpose of $L$).

Can anyone tell me how we can get this same $L$ using the SVD or an eigendecomposition?

Thank you.


Solution 1:

I am not sure why anyone would want to obtain a Cholesky decomposition from an SVD or an eigen-decomposition, but anyway, let's say $A$ is positive definite:

  • As $A$ is positive definite, if $A=U\Sigma V^\ast$ is an SVD of $A$, we must have $U=V$ (exercise). Perform a QR decomposition of $\sqrt{\Sigma}U^\ast$, i.e. write $\sqrt{\Sigma}U^\ast=QR$ for some unitary matrix $Q$ and some upper triangular matrix $R$. Then $A=(\sqrt{\Sigma}U^\ast)^\ast(\sqrt{\Sigma}U^\ast)=R^\ast Q^\ast QR=R^\ast R$ is a Cholesky decomposition of $A$ (see the sketch after this list).
  • If $A=PDP^{-1}$ is an eigendecomposition of $A$, perform a QR decomposition or Gram-Schmidt orthogonalization for each group of columns of $P$ that correspond to the same eigenvalue. Hence we can obtain a set of orthonormal eigenvectors of $A$, i.e. we get some unitary matrix $U$ such that $A=UDU^\ast$. So we can apply the previous method to obtain a Cholesky decomposition $A=R^\ast R$.
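
Here is a minimal NumPy sketch of the first bullet, assuming a real symmetric positive-definite $A$ (so transposes replace conjugate transposes, and the SVD doubles as an eigendecomposition); the variable names are illustrative:

```python
# Cholesky from SVD: A = U Sigma U^T, M = sqrt(Sigma) U^T gives A = M^T M,
# and a QR decomposition M = Q R yields A = R^T R, i.e. L = R^T.
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = B @ B.T + 5 * np.eye(5)           # a symmetric positive-definite test matrix

U, s, Vt = np.linalg.svd(A)           # for positive-definite A, U and V coincide
M = np.sqrt(s)[:, None] * U.T         # M = sqrt(Sigma) U^T, so A = M^T M
Q, R = np.linalg.qr(M)                # M = Q R with R upper triangular
R = np.sign(np.diag(R))[:, None] * R  # flip row signs so diag(R) > 0
L = R.T                               # A = R^T R, so L = R^T is the Cholesky factor

assert np.allclose(L @ L.T, A)
assert np.allclose(L, np.linalg.cholesky(A))
```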

Solution 2:

Several people in this thread asked why you would ever want to do Cholesky on a non-positive-definite matrix. I thought I'd mention a case that would motivate this question.

Cholesky decomposition is used to generate Gaussian random variates given a covariance matrix, using $x_i = \sum_j L_{ij} z_j$, where each $z_j \sim \operatorname{Normal}(0,1)$ and $L$ is the Cholesky factor of the covariance matrix.
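
A minimal sketch of this recipe in NumPy, with a made-up $2\times 2$ covariance matrix for illustration:

```python
# Textbook sampling recipe: if C = L L^T and z is a vector of i.i.d.
# standard normals, then x = L z has covariance C.
import numpy as np

rng = np.random.default_rng(0)
C = np.array([[4.0, 1.0],
              [1.0, 2.0]])              # illustrative covariance matrix
L = np.linalg.cholesky(C)

z = rng.standard_normal((2, 100_000))   # columns of i.i.d. N(0, 1) draws
x = L @ z                               # correlated samples

print(np.cov(x))                        # close to C
```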

A problem arises when the covariance matrix is degenerate, i.e. when the random variation described by the covariance is contained in a lower-dimensional space. One or more of the eigenvalues is zero, the matrix is not positive definite, and calls to Cholesky decomposition routines fail. When you are near this case, things also tend to be extremely sensitive to numerical round-off (i.e., the covariance is ill-conditioned).

There shouldn't be any inherent problem with generating points on this "flat" Gaussian, but the textbook algorithm based on Cholesky breaks.

Eigendecomposition can be used as an alternative for this problem, if you have a robust implementation. Some eigendecomposition algorithms don't do well in this case either, but robust ones exist.
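
As a hedged sketch of this alternative, using NumPy's symmetric eigensolver and clamping the tiny negative eigenvalues that round-off can produce:

```python
# Sampling from a degenerate ("flat") Gaussian via eigendecomposition:
# C = E D E^T with D >= 0, so M = E sqrt(D) satisfies M M^T = C and
# x = M z has covariance C, even when rank(C) < n and Cholesky fails.
import numpy as np

rng = np.random.default_rng(0)
v = np.array([1.0, 2.0, 3.0])
C = np.outer(v, v)                    # rank-1 covariance; np.linalg.cholesky raises here

w, E = np.linalg.eigh(C)              # symmetric eigendecomposition
w = np.clip(w, 0.0, None)             # clamp round-off negatives to zero
M = E * np.sqrt(w)                    # M = E sqrt(D)

z = rng.standard_normal((3, 100_000))
x = M @ z                             # samples confined to the line spanned by v

print(np.cov(x))                      # close to C
```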

Solution 3:

There is an interesting relationship between the eigen-decomposition of a symmetric matrix and its Cholesky factor: say $A = LL'$ with $L$ the Cholesky factor, and $A = EDE'$ the eigen-decomposition. Then $L = ED^{\frac{1}{2}}F$ for some orthogonal matrix $F$ (in fact this is an SVD of $L$), i.e. the Cholesky factor is a rotated form of the matrix of eigenvectors scaled by the diagonal matrix of square-root eigenvalues. So you can get $L$ from $ED^{\frac{1}{2}}$ through a series of orthogonal rotations aimed at making the elements above the diagonal zero.
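
A small NumPy illustration of this relationship, finding the orthogonal $F$ via a QR decomposition of $(ED^{\frac{1}{2}})'$ rather than explicit rotations (the test matrix is made up):

```python
# Solution 3 in code: M = E sqrt(D) satisfies M M^T = A; one orthogonal
# factor F (from the QR decomposition M^T = Q R) rotates M into the
# lower-triangular Cholesky factor, since M Q = R^T.
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T + 4 * np.eye(4)          # symmetric positive definite

w, E = np.linalg.eigh(A)             # A = E D E^T
M = E * np.sqrt(w)                   # M M^T = A, but M is not triangular

Q, R = np.linalg.qr(M.T)             # M^T = Q R  =>  M Q = R^T (lower triangular)
F = Q * np.sign(np.diag(R))          # sign fix so the diagonal comes out positive
L = M @ F

assert np.allclose(L @ L.T, A)
assert np.allclose(L, np.linalg.cholesky(A))
```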

Solution 4:

Cholesky $A=LL^*$:

  • Pros: a little bit faster than SVD (but still $O(n^3)$), and very easy to implement.
  • Cons: it can only deal with the definite/semi-definite cases, so it works only on square matrices. For the semi-definite case, to handle a zero diagonal entry: if $L_{i,i}=0$, then $L_{j,i}=0$ for all $j>i$ (see the sketch just below).
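
A hedged sketch of that semi-definite rule, as a hand-rolled outer-product Cholesky (the function name and tolerance are my own):

```python
# Outer-product Cholesky that tolerates positive-SEMIdefinite input: when a
# pivot is (numerically) zero, the column of L below it is left at zero, which
# is valid because a PSD matrix with a zero diagonal entry has a zero
# row/column there as well.
import numpy as np

def semidefinite_cholesky(A, tol=1e-12):
    """Lower-triangular L with L @ L.T ~ A for symmetric PSD A."""
    A = A.astype(float).copy()
    n = A.shape[0]
    L = np.zeros_like(A)
    for i in range(n):
        if A[i, i] > tol:
            L[i, i] = np.sqrt(A[i, i])
            L[i+1:, i] = A[i+1:, i] / L[i, i]
        # else: zero pivot, so column i of L stays zero
        A[i+1:, i+1:] -= np.outer(L[i+1:, i], L[i+1:, i])
    return L

v = np.array([1.0, 2.0, 2.0])
C = np.outer(v, v)                     # rank-1, only semi-definite
L = semidefinite_cholesky(C)
print(np.allclose(L @ L.T, C))         # True, where np.linalg.cholesky fails
```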

SVD $A=U\Sigma V^\ast$:

  • Pros: it can deal with all kinds of matrices (not necessarily square).
  • Cons: a little bit slower than Cholesky (but still $O(mn^2)$). It requires two steps (the SVD itself plus a QR decomposition to recover a triangular factor) and thus is not as easy to implement as Cholesky.

Solution 5:

Or you can use the LU decomposition (see the sketch below).

Anyhow, you don't normally calculate the Cholesky decomposition from the eigendecomposition or SVD; you use Gaussian elimination. See something like Golub and Van Loan's Matrix Computations.
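
For completeness, a sketch of the LU route under the assumption that $A$ is symmetric positive definite (so elimination needs no pivoting): write $A = LU$ with $L$ unit lower triangular; then $D = \operatorname{diag}(U)$ gives $A = LDL'$, and $LD^{\frac{1}{2}}$ is the Cholesky factor.

```python
# Cholesky via Gaussian elimination: Doolittle LU without pivoting (safe for
# positive-definite input, where all pivots stay positive), then rescale the
# unit lower-triangular L by sqrt(diag(U)).
import numpy as np

def lu_no_pivot(A):
    """A = L @ U with L unit lower triangular, U upper triangular."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    for i in range(n):
        factors = U[i+1:, i] / U[i, i]   # elimination multipliers
        L[i+1:, i] = factors
        U[i+1:, :] -= np.outer(factors, U[i, :])
    return L, U

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T + 4 * np.eye(4)              # symmetric positive definite

L, U = lu_no_pivot(A)
L_chol = L * np.sqrt(np.diag(U))         # scale column i by sqrt(U[i, i])
assert np.allclose(L_chol, np.linalg.cholesky(A))
```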