What is inverse of $I+A$?

Assume $A$ is a square invertible matrix and we have $A^{-1}$. If we know that $I+A$ is also invertible, is there a closed form for $(I+A)^{-1}$ in terms of $A^{-1}$ and $A$?

Does it make it any easier if we know that the sums of all rows are equal?


I know I'm a bit late, but I wanted to add the following for future readers. If you know the eigendecomposition of your matrix, $A = Q \Lambda Q^{-1}$, the result is the following: \begin{equation} \left( A + I\right)^{-1} =\left( Q \Lambda Q^{-1} + I\right)^{-1} =\left( Q \left(\Lambda + I \right) Q^{-1}\right)^{-1} = Q \left(\Lambda + I \right)^{-1} Q^{-1} \end{equation} I know this is a very trivial calculation and the result may not be helpful to many of you, but I wasted countless hours using the Sherman–Morrison formula because I had forgotten about the eigendecomposition. Cheers, Lukas
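A minimal numerical sketch of this identity, using NumPy (the symmetric matrix below is just an arbitrary example with no eigenvalue equal to $-1$):

```python
import numpy as np

# Example: invert A + I via the eigendecomposition A = Q Λ Q^{-1},
# so that (A + I)^{-1} = Q (Λ + I)^{-1} Q^{-1}.
A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 4.]])        # symmetric, so eigenvalues are real

eigvals, Q = np.linalg.eig(A)       # A = Q @ diag(eigvals) @ inv(Q)
# Q @ diag(1/(λ_i + 1)) is just a column-wise scaling of Q.
inv_via_eig = (Q * (1.0 / (eigvals + 1.0))) @ np.linalg.inv(Q)

direct = np.linalg.inv(A + np.eye(3))
print(np.allclose(inv_via_eig, direct))   # True
```

Note that inverting $\Lambda + I$ costs only $O(n)$ once the eigendecomposition is known, which is the practical appeal of this route.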


If the series $S = I - A + A^2 - A^3 + \dotsb$ converges, expand $S(I + A)$ to find that it is equal to the identity. See here for details on when $S$ converges.

$S$ clearly converges if $A^k = 0$ for some positive integer $k$ (nilpotency).
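For a nilpotent $A$ the series has only finitely many nonzero terms, so the identity can be checked exactly. A small sketch (the strictly upper-triangular matrix here is an illustrative example with $A^3 = 0$):

```python
import numpy as np

# For nilpotent A with A^3 = 0, the Neumann series terminates:
# S = I - A + A^2, and S (I + A) = I + A^3 = I exactly.
A = np.array([[0., 1., 2.],
              [0., 0., 3.],
              [0., 0., 0.]])   # strictly upper triangular => A^3 = 0

S = np.eye(3) - A + A @ A      # higher powers vanish
print(S @ (np.eye(3) + A))     # the 3x3 identity matrix
```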

As for invertibility, if $A$ is diagonalizable, i.e., $A = PDP^{-1}$ for some diagonal matrix $D = \operatorname{diag}(e_1, e_2, \dotsc, e_n)$, then by Sylvester's Determinant Theorem, $$ \det(I + (PD)P^{-1}) = \det(P^{-1}(PD) + I) = \det(D + I) = \prod_{i = 1}^n(e_i + 1) $$

Hence, $I + A$ is invertible if and only if no eigenvalue $e_i$ equals $-1$.

If $A$ is nilpotent, then its only eigenvalue is $0$, so $I + A$ is invertible.
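The determinant formula above is easy to verify numerically on diagonal matrices (chosen here purely for illustration), where $\det(I + A) = \prod_i (e_i + 1)$ can be read off directly:

```python
import numpy as np

# det(I + A) = Π (e_i + 1): I + A is singular exactly when some
# eigenvalue of A equals -1.
A = np.diag([2., 3., -1.])            # eigenvalue -1 present
print(np.linalg.det(np.eye(3) + A))   # 0.0: I + A is singular

B = np.diag([2., 3., 5.])             # no eigenvalue equal to -1
print(np.linalg.det(np.eye(3) + B))   # (2+1)(3+1)(5+1) = 72
```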

Is there a closed-form solution?

I believe not. Basically, a closed-form expression of $(I + A)^{-1}$ using $A$ and $A^{-1}$ would amount to a closed-form expression of $(1 + x)^{-1}$ using $x$ and $x^{-1}$, where $x$ is real (or complex). A semi-rigorous articulation of this argument follows:

Proposition: There exists no family of matrices $\{X_{ij}\}_{m \times n}$, where every $X_{ij}$ is either equal to $A$, $A^{-1}$ or a constant dependent on the dimension of $A$, such that $(I + A)^{-1} = \sum_{i = 1}^m(\prod_{j = 1}^nX_{ij})$ for all values of $A$.

Proof:

Assume such a family exists. Let $A$ be the $1 \times 1$ matrix $(x)$. Then $\sum_{i = 1}^m(\prod_{j = 1}^nX_{ij})$ is a Laurent polynomial $P(x)$ (finitely many terms in $x$ and $x^{-1}$), which supposedly equals $(1 + x)^{-1}$ for all $x \neq 0$. But then $P(x)$ would have to agree with the Taylor series $1 - x + x^2 - x^3 + \dotsb$, contradicting the fact that $P(x)$ has only finitely many terms.
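The scalar obstruction is easy to see symbolically: the expansion of $(1+x)^{-1}$ never terminates, so no finite expression in $x$ and $x^{-1}$ can reproduce it. A quick check with SymPy:

```python
import sympy as sp

# The Taylor expansion of 1/(1+x) around 0 does not terminate,
# so no finite Laurent polynomial in x can equal it identically.
x = sp.symbols('x')
print(sp.series(1/(1 + x), x, 0, 6))
# 1 - x + x**2 - x**3 + x**4 - x**5 + O(x**6)
```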


Check this question. The first answer presents a recursive formula for the inverse of a generic sum of matrices, so yours should be a special case.