I have computed the Cholesky decomposition of a positive semidefinite matrix $\Theta$. However, I wish to know the diagonal elements $\Theta^{-1}_{ii}$ of its inverse. Is it possible to do this using the Cholesky factor that I have computed? Or would finding the eigenvalues alone (without the orthonormal matrices of an SVD) help here?

Are there any other suggestions, or alternative decompositions, that would aid in finding the diagonal of the inverse matrix?

Edit: I've seen that random projections do wonders for inverting matrices. Could something like that be applied here?


Solution 1:

I stumbled onto this question while trying to answer a similar one:

I want a diagonal matrix that best approximates the inverse of a matrix $\textbf{B} \succ 0$.

I'll post my answer to that question in case it helps others (and maybe the OP). In this case, "best" means nearest in the least-squares (Frobenius norm) sense:

$$\textbf{d}^*(\textbf{B}) = \operatorname*{argmin}_{\textbf{d}} \tfrac 1 2 \left\| \textbf{B}\operatorname{diag}(\textbf{d}) - \textbf{I} \right\|_F^2$$

This objective is separable in the $d_i$ and differentiable. Setting the gradient to zero yields the closed-form (and very cheap) solution

$$[\textbf{d}^*]_i = \frac{b_{ii}}{\|\textbf{b}_i\|^2},$$

where $\textbf{b}_i$ is the $i$-th column of $\textbf{B}$.

(Note that for complex matrices you would need to conjugate.)
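For concreteness, here is a minimal NumPy sketch of this formula for the real case (the matrix `B` below is just a randomly generated positive definite example, not anything from the question):

```python
import numpy as np

def best_diagonal_inverse(B):
    """Closed-form diagonal approximation of B^{-1} for real B > 0:
    minimizes 0.5 * ||B diag(d) - I||_F^2 over d, which gives
    d_i = B_ii / ||b_i||^2, where b_i is the i-th column of B."""
    col_norms_sq = (B ** 2).sum(axis=0)   # ||b_i||^2 for each column
    return np.diag(B) / col_norms_sq      # d_i = b_ii / ||b_i||^2

# Quick check against the true inverse diagonal on a random SPD matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
B = A @ A.T + 5 * np.eye(5)               # symmetric positive definite
print(best_diagonal_inverse(B))           # diagonal approximation
print(np.diag(np.linalg.inv(B)))          # exact diagonal, for comparison
```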

I wouldn't be surprised if this has been known for 100 years, but I couldn't easily find it.

Solution 2:

If you have the Cholesky decomposition, you can easily compute the whole matrix inverse. Since

$$\Theta = R^* R$$

where $R$ is upper triangular, you can find $\Theta^{-1}$ by solving

$$R^* R X = I$$

where $I$ is the identity. The latter system can be solved by forward and backward substitution.

If you only want the diagonal entries of $X$, you could save perhaps half the computation by stopping the backward substitution process (for each column) when you get to the diagonal entry.
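A sketch of this in NumPy/SciPy (using SciPy's `solve_triangular` in place of hand-written forward/backward substitution, with a random positive definite `Theta` as a stand-in; the last two lines use the equivalent observation that $(\Theta^{-1})_{ii}$ is the squared norm of row $i$ of $R^{-1}$, which avoids forming the full inverse):

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
Theta = A @ A.T + 6 * np.eye(6)        # random positive definite test matrix

R = cholesky(Theta)                    # upper triangular, Theta = R^T R

# Solve R^T R X = I by forward then backward substitution.
Y = solve_triangular(R, np.eye(6), trans='T', lower=False)  # R^T Y = I (forward)
X = solve_triangular(R, Y, lower=False)                     # R X = Y (backward)

print(np.diag(X))                      # diagonal of Theta^{-1}
print(np.diag(np.linalg.inv(Theta)))   # sanity check

# If only the diagonal is needed: Theta^{-1} = R^{-1} R^{-T}, so
# (Theta^{-1})_{ii} is the squared norm of row i of R^{-1}.
Rinv = solve_triangular(R, np.eye(6), lower=False)
print((Rinv ** 2).sum(axis=1))
```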

Solution 3:

Tang and Saad have a method that uses random vectors (not necessarily projections):

A Probing Method for Computing the Diagonal of a Matrix Inverse