Deriving the inverse of $I + A$ for an idempotent matrix $A$

Suppose I have an idempotent matrix $A$, so that $A^2=A$. From its properties, if $A$ is not the identity matrix, then it is singular. Through trial and error, I can see that $I+A$ is invertible for every such $A$.

But how can I show that $I+A$ is indeed invertible, and then find its inverse? I searched and found that $(I+A)^{-1}=\frac{1}{2}(2I-A)$, so $I+A$ must indeed have an inverse, but how was this derived?
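For what it's worth, here is the kind of trial-and-error check I mean (a minimal NumPy sketch; the projection matrix below is just one arbitrary idempotent example):

```python
import numpy as np

# One arbitrary idempotent matrix (a projection onto a line): A @ A == A.
A = np.array([[1.0, 0.0],
              [1.0, 0.0]])
assert np.allclose(A @ A, A)

M = np.eye(2) + A
candidate_inverse = 0.5 * (2 * np.eye(2) - A)   # the formula I found

print(np.allclose(M @ candidate_inverse, np.eye(2)))      # True
print(np.allclose(candidate_inverse, np.linalg.inv(M)))   # True
```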


If $A$ is a diagonalizable invertible matrix, you can use Lagrange's Interpolation Formula to invert it. More precisely, if the distinct eigenvalues are $\lambda_1,\dots,\lambda_k$, then we have $$A^{-1}=\sum_{i=1}^k\ \frac{1}{\lambda_i}\ \prod_{j\not=i}\ \frac{A-\lambda_jI}{\lambda_i-\lambda_j}\quad.$$ More generally, the inverse of an invertible matrix $A$ is a polynomial in $A$, which depends only on the minimal polynomial of $A$, and which is given by a simple formula. If you're interested, I'll be happy to give you more details.
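As an illustration (my own sketch, not part of the original answer), the following NumPy snippet implements this formula and checks it in the situation of the question, where $M=I+A$ is diagonalizable with eigenvalues $1$ and $2$ whenever $A$ is idempotent and different from $0$ and $I$:

```python
import numpy as np

def inverse_from_eigenvalues(M, eigenvalues):
    """Lagrange interpolation formula: inverse of a diagonalizable M
    from its distinct (nonzero) eigenvalues."""
    n = M.shape[0]
    I = np.eye(n)
    total = np.zeros((n, n))
    for i, lam_i in enumerate(eigenvalues):
        term = I / lam_i
        for j, lam_j in enumerate(eigenvalues):
            if j != i:
                term = term @ (M - lam_j * I) / (lam_i - lam_j)
        total += term
    return total

# In the question, M = I + A with A idempotent, A != 0, I,
# so M is diagonalizable with distinct eigenvalues 1 and 2.
A = np.array([[1.0, 0.0], [1.0, 0.0]])          # A @ A == A
M = np.eye(2) + A
print(np.allclose(inverse_from_eigenvalues(M, [1.0, 2.0]), np.linalg.inv(M)))  # True
```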

EDIT 1. @rcollyer kindly asked for more details. Here they are.

Let $A$ be a complex invertible square matrix with minimal polynomial $f\in\mathbb C[X]$, and let $$f=(X-\lambda_1)^{m(1)}\cdots(X-\lambda_k)^{m(k)}$$ be the factorization of $f$, where the $\lambda_j$ are the distinct eigenvalues of $A$.

There is a unique polynomial $g$ of degree less than the degree $d$ of $f$ such that $g(A)=A^{-1}$. Moreover $g$ is given by the following recipe.

For any rational fraction $\varphi\in\mathbb C(X)$ defined at $\lambda_j$, let $T_j(\varphi)$ be the Taylor approximation of $\varphi$ at $X=\lambda_j$ of degree less than $m(j)$.

Put $g_j:=T_j(1/X)$.
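For instance (an illustration of my own, not part of the original recipe), if $m(j)=2$, then Taylor expansion of $1/X$ at $X=\lambda_j$ gives $$g_j=\frac{1}{\lambda_j}-\frac{X-\lambda_j}{\lambda_j^{2}}.$$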

Then $g$ is the unique solution of degree less than $d$ to the congruences $$g\equiv g_j\bmod(X-\lambda_j)^{m(j)},\quad 1\le j\le k.$$ More precisely, $g$ is given by $$g=\sum_{j=1}^k\ T_j\!\!\!\!\left(g_j\ \frac{(X-\lambda_j)^{m(j)}}{f}\right)\ \frac{f}{(X-\lambda_j)^{m(j)}}\quad.$$
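As a concrete illustration (my own example, not part of the original answer), here is a short SymPy sketch that assembles $g$ from this formula for a matrix whose minimal polynomial has a repeated root, and checks that $g(A)=A^{-1}$:

```python
import sympy as sp

X = sp.symbols('X')

# Example matrix: a Jordan block for eigenvalue 2 (size 2) plus eigenvalue 3,
# so the minimal polynomial is f = (X - 2)**2 * (X - 3).
A = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, 3]])
spectrum = [(2, 2), (3, 1)]                      # pairs (lambda_j, m(j))
f = sp.prod((X - lam)**m for lam, m in spectrum)

def T(expr, lam, m):
    """Taylor approximation of expr at X = lam, of degree less than m."""
    return sp.series(expr, X, lam, m).removeO()

g = sp.Integer(0)
for lam, m in spectrum:
    rest = sp.cancel(f / (X - lam)**m)           # f / (X - lambda_j)^{m(j)}
    g_j = T(1 / X, lam, m)                       # g_j = T_j(1/X)
    g += T(g_j / rest, lam, m) * rest            # j-th summand of the formula
g = sp.expand(g)                                 # polynomial of degree < deg f

# Evaluate g at the matrix A (Horner scheme) and compare with A^{-1}.
G = sp.zeros(3, 3)
for c in sp.Poly(g, X).all_coeffs():             # highest-degree coefficient first
    G = G * A + c * sp.eye(3)
print(sp.simplify(G * A) == sp.eye(3))           # True
```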

Again, I'll be happy to offer any further explanation I can give.

EDIT 2. How to prove the above claims? There are three ingredients:

(1) The canonical epimorphism $\mathbb C[X]\twoheadrightarrow\mathbb C[A]$ induces an isomorphism $\mathbb C[X]/(f)\overset\sim\to\mathbb C[A]$.

(2) By the Chinese Remainder Theorem, the natural morphism from $\mathbb C[X]/(f)$ to the product of the $\mathbb C[X]/(X-\lambda_j)^{m(j)}$ is an isomorphism.

(3) Taylor's formula enables one to invert the above isomorphism.

I'll just say a few words about (3). Suppose for simplicity $k=3$. So, we want to solve $$g\equiv g_1\bmod(X-\lambda_1)^{m(1)},$$ $$g\equiv g_2\bmod(X-\lambda_2)^{m(2)},$$ $$g\equiv g_3\bmod(X-\lambda_3)^{m(3)}.$$ Suppose we can solve the system $$h_1\equiv g_1\bmod(X-\lambda_1)^{m(1)},$$ $$h_1\equiv 0\bmod(X-\lambda_2)^{m(2)},$$ $$h_1\equiv 0\bmod(X-\lambda_3)^{m(3)};$$ the system $$h_2\equiv 0\bmod(X-\lambda_1)^{m(1)},$$ $$h_2\equiv g_2\bmod(X-\lambda_2)^{m(2)},$$ $$h_2\equiv 0\bmod(X-\lambda_3)^{m(3)};$$ and the system $$h_3\equiv 0\bmod(X-\lambda_1)^{m(1)},$$ $$h_3\equiv 0\bmod(X-\lambda_2)^{m(2)},$$ $$h_3\equiv g_3\bmod(X-\lambda_3)^{m(3)}.$$ Then we'll just set $g:=h_1+h_2+h_3$. How do we solve the system for $h_1$? The last two congruences tell us that $h_1$ will be of the form $(X-\lambda_2)^{m(2)}(X-\lambda_3)^{m(3)}u$, and we only have to solve $$(X-\lambda_2)^{m(2)}\ (X-\lambda_3)^{m(3)}\ u\equiv g_1\bmod(X-\lambda_1)^{m(1)},$$ which we can write as $$T_1\Big((X-\lambda_2)^{m(2)}\ (X-\lambda_3)^{m(3)}\ u\Big)=T_1(g_1),$$ where $T_1(\,\cdot\,)$ denotes the Taylor approximation of degree less than $m(1)$ at $X=\lambda_1$. This gives $$T_1(u)=T_1\!\!\left(\frac{g_1}{(X-\lambda_2)^{m(2)}(X-\lambda_3)^{m(3)}}\right),$$ whence the formula.
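To connect this recipe back to the original question (this application is my own, not part of the original answer): for an idempotent $A$ with $A\neq 0$ and $A\neq I$, the matrix $M=I+A$ has minimal polynomial $f=(X-1)(X-2)$, so $k=2$, $m(1)=m(2)=1$, $g_1=T_1(1/X)=1$ and $g_2=T_2(1/X)=\frac12$. The formula then gives $$g=T_1\!\left(\frac{g_1}{X-2}\right)(X-2)+T_2\!\left(\frac{g_2}{X-1}\right)(X-1)=(-1)(X-2)+\frac12(X-1)=\frac32-\frac X2,$$ so that $M^{-1}=g(M)=\frac12(3I-M)=I-\frac12A$, in agreement with the formula quoted in the question.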

EDIT 3. The general formula appears in the entry Hermite interpolation formula of the Encyclopaedia of Mathematics, edited by Michiel Hazewinkel.


If a square $n \times n$ matrix $M$ has nonzero determinant, then by the Cayley-Hamilton theorem $M^{-1}$ exists and is a polynomial in $M$ of degree $\leq n-1$.
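To spell out that step (my own addition, under the stated assumption $\det M\neq0$): if the characteristic polynomial is $p(X)=X^n+c_{n-1}X^{n-1}+\cdots+c_1X+c_0$ with $c_0=(-1)^n\det M\neq0$, then Cayley-Hamilton gives $p(M)=0$, hence $$M^{-1}=-\frac{1}{c_0}\left(M^{n-1}+c_{n-1}M^{n-2}+\cdots+c_1I\right).$$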

Here $M = I + A$, and because $A^2 = A$, the matrix $M$ satisfies a polynomial equation of degree $2$ (written out below), so the inverse, if it exists, is a linear function of $A$. It does exist by the eigenvalue argument in the comments. The coefficients of the linear function can be found by various means, such as specializing $A$ to $0$ or $I$, or solving for $x$ and $y$ such that $(I+A)(xI+yA)=I$.
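For instance, the last approach works out as follows (my own expansion of the hint): using $A^2=A$, $$(I+A)(xI+yA)=xI+(x+y)A+yA^2=xI+(x+2y)A,$$ and requiring this to equal $I$ for every idempotent $A$ forces $x=1$ and $x+2y=0$, i.e. $y=-\frac12$, so $(I+A)^{-1}=I-\frac12A$.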

The eigenvalue argument and the formula for the inverse assume that the matrix entries are taken from a field (or ring) where division by $2$ is possible. In characteristic 2 the statement is false; if $2=0$ then $A=I$ is idempotent but $I+A=0$.

Another method is to write the idempotence of $A$ as a condition on $M$:

$(M-I)^2 = M-I$, i.e. $M^2 - 3M + 2I = 0$, which is the same as $M(3I - M)=2I$. The inverse of $M$ is therefore $(3I - M)/2 = (3I - (I+A))/2 = I - A/2$.
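A quick numerical sanity check of this identity (my own sketch; the idempotent matrix is an orthogonal projection built from a random matrix) could look like:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 3))
A = B @ np.linalg.inv(B.T @ B) @ B.T     # orthogonal projection, hence idempotent
M = np.eye(5) + A

print(np.allclose(A @ A, A))                                 # A is idempotent
print(np.allclose(M @ (3 * np.eye(5) - M) / 2, np.eye(5)))   # M(3I - M) = 2I
print(np.allclose(np.linalg.inv(M), np.eye(5) - A / 2))      # M^{-1} = I - A/2
```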