Inverse of the sum of an invertible matrix with known Cholesky decomposition and a diagonal matrix

At the present time there is no known algorithm for efficiently performing high-rank diagonal updates to Cholesky or LU factorizations, even in the case where the update matrix is a multiple of the identity. Such an algorithm is highly desirable in a wide variety of applications, and if one were discovered it would be a major breakthrough in numerical linear algebra. The following related math.stackexchange and scicomp.stackexchange threads are worth looking into:

  • Cholesky of Matrix plus Identity

  • Can diagonal plus fixed symmetric linear systems be solved in quadratic time after precomputation?

as well as the following links noted by Kirill in the comments of the above-noted math.stackexchange thread: [1], [2], [3], [4], [5].

However, if you are willing to consider other types of matrix decompositions such as the (generalized) eigenvalue decomposition, (generalized) Schur decomposition, or (generalized) singular value decomposition, then there are efficient algorithms for performing updates based on precomputation, as long as the update is of the form: $$M \rightarrow M + d B_0,$$ where $B_0$ is a general fixed matrix that can be involved with the precomputation, and $d$ is a scalar that is not known at the precomputation stage, but rather is updated on the fly. Efficient update algorithms for the case where the matrix $B_0$ changes are not currently known (even in the diagonal case).

It turns out that there is no essential difference whether the update matrix $B_0$ is diagonal or not, though it does matter whether it is the identity. Here I mention and summarize the results, then discuss each case in more detail below.

  • Updates for symmetric $M$ and $B_0$ can be done efficiently after precomputing an eigenvalue decomposition, whereas in the nonsymmetric case the Schur decomposition must be used.

  • If $B_0$ is the identity one can use the standard versions of the decompositions listed above, whereas if $B_0$ is not the identity, the generalized versions are required.

  • For situations where the matrices naturally arise in the form $M=A^TA$ and $B_0=R^TR$ (e.g., updates to a regularization parameter in regularized least squares problems), one can work directly with the factors $A$ and $R$ by precomputing a generalized SVD decomposition, thereby never forming the squared systems, which could be much larger if the factor matrices are rectangular.

  • Finally, if the update is low-rank (e.g., $B_0$ is diagonal but only contains a few nonzero diagonal elements), one can perform updates to a solver based on any factorization (LU, Cholesky, whatever) with the Woodbury formula.

A summary of which decompositions can be used for certain cases is shown in the following tables. The numbers reference more detailed discussion below.

$$\begin{array}{c|c|c} & \text{update }= d I & \text{update} = d B_0\\ \hline M \text{ and } B_0 \text{ symmetric} & \text{eigenvalue decomposition}~(1.) & \text{generalized eigenvalue decomposition}~(2.)\\ \hline M \text{ and/or } B_0 \text{ nonsymmetric}& \text{Schur decomposition}~(3.) & \text{generalized Schur decomposition}~(4.) \end{array}$$ and $$\begin{array}{c|c} M=A^TA ~\text{ and } ~B_0=R^TR & \text{generalized SVD}~(5.)\\ \hline B_0 \text{ is low rank} & \text{Woodbury formula}~(6.) \end{array}$$

Details for specific cases:

  1. (Symmetric $M$, $B_0=I$) Let $QDQ^T$ be the eigenvalue decomposition of $M$. The inverse of the updated version can be written as: $$(M + dI)^{-1} = Q(D + dI)^{-1}Q^T.$$
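A minimal NumPy sketch of this case (the matrix `M` and the helper `inv_update` are illustrative names, not from the original): the eigendecomposition is precomputed once, after which each new value of $d$ costs only a diagonal inversion and two dense multiplies.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
M = A + A.T                      # symmetric test matrix

# Precompute once: eigenvalue decomposition M = Q diag(evals) Q^T
evals, Q = np.linalg.eigh(M)

def inv_update(d):
    """Inverse of M + d*I via (M + dI)^{-1} = Q (D + dI)^{-1} Q^T."""
    return Q @ np.diag(1.0 / (evals + d)) @ Q.T

d = 3.7
assert np.allclose(inv_update(d), np.linalg.inv(M + d * np.eye(n)))
```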

  2. (Symmetric $M$ and $B_0$) Let $$B_0 U = M U \Lambda$$ be the factorization associated with the generalized eigenvalue problem for $B_0$ and $M$. It turns out (see link in previous sentence) that this $U$ simultaneously diagonalizes $M$ and $B_0$, in the sense that $U^T B_0 U = \Lambda$ and $U^T M U = I$, so you can write $$M+dB_0 = U^{-T}U^T(M + d B_0)UU^{-1} = U^{-T}(I + d \Lambda)U^{-1}.$$ The inverse of the updated matrix is then: $$(M+dB_0)^{-1} = U (I + d \Lambda)^{-1} U^T.$$
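This case can be sketched with SciPy, whose `scipy.linalg.eigh(a, b)` solves the generalized problem $a\,v = \lambda\, b\, v$ with the normalization $V^T b V = I$, exactly the simultaneous diagonalization above (here $M$ is taken positive definite so it can play the role of $b$; variable names are illustrative):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n))
M = A @ A.T + np.eye(n)          # symmetric positive definite
B = rng.standard_normal((n, n))
B0 = B + B.T                     # symmetric update matrix

# Precompute once: solve B0 U = M U Lambda, normalized so U^T M U = I,
# which simultaneously gives U^T B0 U = Lambda.
lam, U = eigh(B0, M)

def inv_update(d):
    """Inverse of M + d*B0 via (M + d B0)^{-1} = U (I + d Lambda)^{-1} U^T."""
    return U @ np.diag(1.0 / (1.0 + d * lam)) @ U.T

d = 0.1
assert np.allclose(inv_update(d), np.linalg.inv(M + d * B0))
```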

  3. (Nonsymmetric $M$, $B_0=I$) Use the Schur decomposition as described in Jack Poulson's answer on scicomp.
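A sketch of the Schur-based update in SciPy (the complex Schur form is used so that $T$ is strictly upper triangular, which lets each new $d$ be handled by a single triangular solve; names are illustrative):

```python
import numpy as np
from scipy.linalg import schur, solve_triangular

rng = np.random.default_rng(2)
n = 5
M = rng.standard_normal((n, n))  # general nonsymmetric matrix

# Precompute once: complex Schur form M = Q T Q^H with T upper triangular.
T, Q = schur(M, output="complex")

def solve_update(d, b):
    """Solve (M + d*I) x = b using (M + dI)^{-1} = Q (T + dI)^{-1} Q^H."""
    y = solve_triangular(T + d * np.eye(n), Q.conj().T @ b)
    return Q @ y

d = 2.5
b = rng.standard_normal(n)
x = solve_update(d, b)
assert np.allclose((M + d * np.eye(n)) @ x, b)
```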

  4. (Nonsymmetric $M$ and $B_0$) Let $$M=Q S Z^T, \quad B_0 = Q T Z^T$$ be the generalized Schur decomposition of $M$ and $B_0$ (also sometimes referred to as the QZ decomposition). Here $Q,Z$ are orthogonal, and $S,T$ are upper triangular. Then the update takes the form, $$M + d B_0 = Q(S + d T)Z^T,$$ with the inverse being: $$(M + d B_0)^{-1} = Z(S + d T)^{-1}Q^T.$$ Since the sum of upper triangular matrices is upper triangular, one can perform solves for such an updated system by triangular backsubstitution.
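The QZ case can be sketched with `scipy.linalg.qz` (again requesting complex output so $S$ and $T$ are genuinely upper triangular rather than quasi-triangular; names are illustrative):

```python
import numpy as np
from scipy.linalg import qz, solve_triangular

rng = np.random.default_rng(3)
n = 5
M = rng.standard_normal((n, n))
B0 = rng.standard_normal((n, n))

# Precompute once: generalized (QZ) Schur form
#   M = Q S Z^H,  B0 = Q T Z^H,  with S, T upper triangular.
S, T, Q, Z = qz(M, B0, output="complex")

def solve_update(d, b):
    """Solve (M + d*B0) x = b using (M + d B0)^{-1} = Z (S + d T)^{-1} Q^H."""
    y = solve_triangular(S + d * T, Q.conj().T @ b)
    return Z @ y

d = 0.7
b = rng.standard_normal(n)
x = solve_update(d, b)
assert np.allclose((M + d * B0) @ x, b)
```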

  5. ($M=A^TA$ and $B_0=R^TR$) Use the generalized SVD. The way to do this for matrix updates is described as an example in Section 3 of Van Loan's original paper:

Van Loan, Charles F. "Generalizing the singular value decomposition." SIAM Journal on Numerical Analysis 13.1 (1976): 76-83.

  6. ($B_0$ is low rank) Use the Woodbury formula.
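A sketch of the low-rank diagonal case: if $B_0$ has only $k$ nonzero diagonal entries, write $d B_0 = U C U^T$ with $U$ an $n \times k$ column selector and $C$ a $k \times k$ diagonal, and apply Woodbury against a precomputed Cholesky factorization of $M$ (here $M$ is taken positive definite so `cho_factor` applies; all names are illustrative):

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(4)
n, k = 8, 2                       # update touches only k diagonal entries
A = rng.standard_normal((n, n))
M = A @ A.T + np.eye(n)           # symmetric positive definite
idx = np.array([1, 5])            # positions of the nonzero diagonal entries
c = rng.standard_normal(k)        # their values

# Precompute once: Cholesky factorization of M.
chol = cho_factor(M)

def solve_update(d, b):
    """Solve (M + d*B0) x = b, B0 = diag supported on idx, via Woodbury:
    (M + U C U^T)^{-1} = M^{-1} - M^{-1}U (C^{-1} + U^T M^{-1} U)^{-1} U^T M^{-1}."""
    U = np.eye(n)[:, idx]                        # n-by-k selector matrix
    Minv_b = cho_solve(chol, b)
    Minv_U = cho_solve(chol, U)
    cap = np.diag(1.0 / (d * c)) + U.T @ Minv_U  # k-by-k capacitance matrix
    return Minv_b - Minv_U @ np.linalg.solve(cap, U.T @ Minv_b)

d = 1.3
B0 = np.zeros((n, n))
B0[idx, idx] = c
b = rng.standard_normal(n)
assert np.allclose((M + d * B0) @ solve_update(d, b), b)
```

Each update then costs two triangular solves plus a $k \times k$ dense solve, instead of refactoring the full $n \times n$ system.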

I'm not sure how to use Cholesky here, but here is a method that avoids inverting any non-diagonal matrices.

  1. First write (assuming the diagonal entries of $D$ are positive; for $D=dI$ this step is immediate), $$(M+D)^{-1}=D^{-1/2}\left(D^{-1/2}MD^{-1/2}+I\right)^{-1}D^{-1/2}.$$
  2. Now use the fact that real symmetric matrices are orthogonally diagonalizable. Note that $D^{-1/2}MD^{-1/2}$ is symmetric (whereas $MD^{-1}$ generally is not), so you can find $D^{-1/2}MD^{-1/2}=Q\Lambda Q^T$ with $Q$ orthogonal.
  3. Then,

$$(M+D)^{-1}=D^{-1/2}(Q\Lambda Q^T+I)^{-1}D^{-1/2}=D^{-1/2}Q(\Lambda+I)^{-1}Q^TD^{-1/2}.$$
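A NumPy sketch of this approach, using the symmetrized form $D^{-1/2}MD^{-1/2}$ so that the eigendecomposition is of a symmetric matrix (this assumes $D$ has positive diagonal entries; all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 6
A = rng.standard_normal((n, n))
M = A + A.T                       # symmetric
ddiag = rng.uniform(1.0, 2.0, n)  # positive diagonal entries of D
D = np.diag(ddiag)

# C = D^{-1/2} M D^{-1/2} is symmetric, so eigh applies.
s = 1.0 / np.sqrt(ddiag)          # diagonal of D^{-1/2}
C = (s[:, None] * M) * s[None, :]
lam, Q = np.linalg.eigh(C)

# (M + D)^{-1} = D^{-1/2} Q (Lambda + I)^{-1} Q^T D^{-1/2}:
# only a diagonal matrix is ever inverted.
Minv = (s[:, None] * Q) @ np.diag(1.0 / (lam + 1.0)) @ (Q.T * s[None, :])
assert np.allclose(Minv @ (M + D), np.eye(n))
```

Note that, unlike the precomputation-based schemes above, here the eigendecomposition depends on $D$, so it must be redone whenever $D$ changes (except in the $D=dI$ case).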