$\operatorname{adj}(AB) = \operatorname{adj} B \operatorname{adj} A$

I thought of the following proof, possibly simpler than the ones given above inasmuch as it uses only elementary linear algebra; it makes use of the Cauchy-Binet formula and, importantly, works for non-invertible matrices too. In the following, $A^{ij}$ denotes the matrix obtained from a matrix $A$ by deleting the $i$th row and the $j$th column, $A^{i0}$ denotes $A$ with only the $i$th row deleted (all columns remaining), and similarly $A^{0j}$ denotes $A$ with only the $j$th column deleted.

For any $1\leq i,j \leq n$, $$\left(\operatorname{adj}\left(B\right)\operatorname{adj}\left(A\right)\right)_{ij}=\sum_{k=1}^{n}\left(\operatorname{adj}B\right)_{ik}\left(\operatorname{adj}A\right)_{kj}\\=\sum_k \left(-1\right)^{k+i}\det\left(B^{ki}\right)\left(-1\right)^{j+k}\det\left(A^{jk}\right)\\=\left(-1\right)^{i+j}\sum_k \det\left(A^{jk}\right)\det\left(B^{ki}\right)\\=\left(-1\right)^{i+j}\sum_k \det\left(A^{jk}B^{ki}\right)=\left(*\right)$$ where the last equality uses the multiplicativity of the determinant for the square $(n-1)\times(n-1)$ matrices $A^{jk}$ and $B^{ki}$.

But we notice that $\left(AB\right)^{ji}=A^{j0}B^{0i}$ (deleting row $j$ and column $i$ from the product $AB$ amounts to deleting row $j$ of $A$ and column $i$ of $B$), and so by the Cauchy-Binet formula we have: $$\det\left(\left(AB\right)^{ji}\right)=\sum_k \det\left(A^{jk}\right)\det\left(B^{ki}\right)=\sum_k \det\left(A^{jk}B^{ki}\right)$$

which gives us: $$\left(*\right)=\left(-1\right)^{i+j}\det\left(\left(AB\right)^{ji}\right)=\left(\operatorname{adj}\left(AB\right)\right)_{ij},$$ and we are done.
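As a numerical sanity check of the Cauchy-Binet step, here is a small NumPy experiment (a sketch; the helper `drop` and the particular indices are my own choices, not part of the proof) comparing $\det\left((AB)^{ji}\right)$ with $\sum_k \det\left(A^{jk}B^{ki}\right)$ for random matrices:

```python
import numpy as np

def drop(M, row=None, col=None):
    # remove one row and/or one column (0-based), mirroring the A^{ij} notation
    if row is not None:
        M = np.delete(M, row, axis=0)
    if col is not None:
        M = np.delete(M, col, axis=1)
    return M

rng = np.random.default_rng(1)
n, j, i = 4, 2, 1                      # matrix size and arbitrary (0-based) indices j, i
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

lhs = np.linalg.det(drop(A @ B, row=j, col=i))             # det((AB)^{ji})
rhs = sum(np.linalg.det(drop(A, row=j, col=k) @ drop(B, row=k, col=i))
          for k in range(n))                               # sum_k det(A^{jk} B^{ki})
assert np.isclose(lhs, rhs)
```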

I hope that this is correct and, if so, that it helps.
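If a further check helps, the entrywise cofactor formula used above can be implemented directly (a NumPy sketch; the helper `adjugate` is mine) and tested on a product where one factor is deliberately singular, since the proof does not assume invertibility:

```python
import numpy as np

def adjugate(M):
    # (adj M)_{ij} = (-1)^{i+j} det(M with row j and column i deleted)
    n = M.shape[0]
    out = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(M, j, axis=0), i, axis=1)
            out[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return out

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
B[:, 0] = B[:, 1]                      # make B singular on purpose
assert np.allclose(adjugate(A @ B), adjugate(B) @ adjugate(A))
```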


The easiest technique for dealing with the adjugate matrix is to consider the field of rational functions in $2n^2$ indeterminates $K=F(X,Y)$, where $X$ and $Y$ denote the sets of indeterminates $X_{ij}$ and $Y_{ij}$, for $1\le i,j\le n$. Here $F$ is the base field, in your case probably $\mathbb{R}$ or $\mathbb{C}$.

Then the matrices $X=[X_{ij}]$ and $Y=[Y_{ij}]$ with coefficients in $K$ are invertible, because they have nonzero determinant. By the general rule $$\def\adj{\operatorname{adj}} (\det X) X^{-1}=\adj X $$ and similarly for $Y$ and $YX$, we have $$ \adj(YX)=\det(YX)\cdot(YX)^{-1}=(\det X\cdot\det Y)\,X^{-1}Y^{-1}, $$ while $$ (\det X)X^{-1}\cdot(\det Y)Y^{-1}=\adj X\cdot\adj Y. $$ Comparing the two expressions we get $$ \adj X\cdot\adj Y=\adj(YX). $$ Now these matrices have coefficients in $F[X,Y]$, the ring of polynomials in the $2n^2$ indeterminates above. Substituting the entries of $A$ for the $X_{ij}$ and those of $B$ for the $Y_{ij}$ gives your claim: $$ \adj A\cdot \adj B = \adj(BA). $$
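This generic-matrix argument can be replayed in a computer algebra system: SymPy can build $X$ and $Y$ with genuinely independent symbolic entries and verify the polynomial identity directly (a sketch for $n=2$; the identity is polynomial, so any $n$ works, but the expansion grows quickly):

```python
import sympy as sp

n = 2  # kept small for speed
X = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"x{i}{j}"))
Y = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"y{i}{j}"))

# adj X · adj Y = adj(YX), as an identity in the polynomial ring F[X, Y]
lhs = (X.adjugate() * Y.adjugate()).expand()
rhs = (Y * X).adjugate().expand()
assert lhs == rhs
```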


In addition to DonAntonio's answer, if you want an argument that works directly with the matrices themselves, you can go through the following.

We know that $A\,(\operatorname{adj} A) = |A|\,I$ (why?)

If $A$ and $B$ are invertible matrices of the same order, then:

$\Rightarrow (AB)\,\operatorname{adj}(AB) = |A|\,|B|\,I$

$\Rightarrow (AB)\,\operatorname{adj}(AB) = |A|\,I\,|B|\,I$

$\Rightarrow (AB)\,\operatorname{adj}(AB) = A\,(\operatorname{adj} A)\,|B|\,I$

$\Rightarrow B\,\operatorname{adj}(AB) = (\operatorname{adj} A)\,|B|\,I$

$\Rightarrow B\,\operatorname{adj}(AB) = B\,(\operatorname{adj} B)\,(\operatorname{adj} A)$

(In the fourth line we cancelled $A$ on the left; in the last line we used $|B|\,I = B\,\operatorname{adj} B$ and the fact that the scalar $|B|$ commutes with $\operatorname{adj} A$.)

Can you work it out from here?
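The chain above leans on invertibility (to cancel $A$ and then $B$). For invertible $M$ one also has $\operatorname{adj} M = |M|\,M^{-1}$, which gives a one-line numerical check of the identity (a NumPy sketch; random Gaussian matrices are invertible with probability 1):

```python
import numpy as np

def adj(M):
    # adjugate of an invertible matrix via adj(M) = det(M) * M^{-1}
    return np.linalg.det(M) * np.linalg.inv(M)

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
assert np.allclose(adj(A @ B), adj(B) @ adj(A))
```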


Another answer here:

Consider $(A+tI)^*$ for $t\in \mathbb{R}$. Every entry of $(A+tI)^*$ is a polynomial in $t$, and setting $t=0$ recovers $A^*$; hence $$ (A+tI)^*= A^*+ tC(t), \qquad t\in \mathbb{R}, $$ for some matrix $C(t)$ whose entries are polynomials in $t$. Similarly, there are matrices $D(t),E(t)$ with polynomial entries such that $$ (B+tI)^*= B^*+ tD(t), $$ $$ [(A+tI)(B+tI)]^*= (AB)^*+ tE(t). $$

Now, note that $A+tI$ is singular only when $-t$ is an eigenvalue of $A$, so there are only finitely many such $t$; hence there is $t_0>0$ such that both $A+tI$ and $B+tI$ are invertible for $t\in (t_0, \infty)$. For invertible matrices the identity is easy to show, so
$$ [(A+tI)(B+tI)]^*= (B+tI)^* (A+tI)^*, \qquad t \in (t_0, \infty). $$ We have obtained $$(AB)^*+ tE(t)= (B^*+ tD(t))(A^*+ tC(t))= B^*A^* +tF(t), \qquad t \in (t_0, \infty), $$ where $F(t)= D(t)A^* +B^*C(t) + t\, D(t)C(t)$. Both sides have polynomial entries and agree on the infinite set $(t_0,\infty)$, so they are equal for all $t\in \mathbb{R}$. Setting $t=0$, we get what we want: $(AB)^*=B^*A^*$.
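Both ingredients of this argument are easy to check in SymPy: the entries of $(A+tI)^*$ really are polynomials in $t$, and the resulting identity holds even for singular matrices (a sketch; the particular singular matrices are my own choices):

```python
import sympy as sp

t = sp.symbols('t')
A = sp.Matrix([[1, 2], [2, 4]])   # singular: second row = 2 * first row
B = sp.Matrix([[0, 1], [0, 3]])   # singular: first column is zero

# entries of (A + tI)^* are polynomials in t, and t = 0 recovers A^*
P = (A + t * sp.eye(2)).adjugate()
assert all(entry.is_polynomial(t) for entry in P)
assert P.subs(t, 0) == A.adjugate()

# the identity survives the limit t -> 0, i.e. holds for singular matrices too
assert (A * B).adjugate() == B.adjugate() * A.adjugate()
```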