Why is it true that $\mathrm{adj}(A)A = \det(A) \cdot I$?

This is a statement in linear algebra whose proof I can't seem to understand.

For a square matrix $A$, why is: $$\mathrm{adj}(A)A = \det(A) \cdot I$$

Any explanation would be greatly appreciated.


Solution 1:

There is a nice short explanation. You probably know that the determinant of any matrix can be expanded row or column-wise using the minors:

$$\det A=\sum_{i=1}^n a_{ij}(-1)^{i+j}\det A(i\mid j)$$

or

$$\det A=\sum_{j=1}^n a_{ij}(-1)^{i+j}\det A(i\mid j)$$

for any $j$ (resp. $i$) of your liking, where we obtain $A(i\mid j)$ from $A$ by striking out the $i$-th row and the $j$-th column.

We define the coefficients of the adjoint $\hat A$ by $$(\hat A)_{ij}=(-1)^{{i+j}}\det A(j\mid i)$$
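To make this definition concrete, here is the standard $2\times 2$ case (a worked example, not part of the argument that follows). For $A=\begin{pmatrix}a&b\\c&d\end{pmatrix}$, the definition gives $(\hat A)_{11}=d$, $(\hat A)_{12}=-b$, $(\hat A)_{21}=-c$, $(\hat A)_{22}=a$, so

$$\hat A=\begin{pmatrix}d&-b\\-c&a\end{pmatrix},\qquad A\hat A=\begin{pmatrix}ad-bc&0\\0&ad-bc\end{pmatrix}=\det A\cdot I.$$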

Now, upon matrix multiplication, we have

$$(A\cdot\hat A)_{k\ell}=\sum_{i=1}^n (-1)^{i+\ell}a_{ki} \det A(\ell\mid i)$$

If $k=\ell$, then $$(A\cdot \hat A)_{\ell\ell}=\sum_{i=1}^n(-1)^{i+\ell}a_{\ell i}\det A(\ell \mid i)=\det A$$ since we're expanding the determinant through the $\ell$-th row.

If $k\neq \ell$

$$(A\cdot\hat A)_{k\ell}=\sum_{i=1}^n (-1)^{i+\ell}a_{ki} \det A(\ell\mid i)=0$$

for it is the expansion of the determinant of the matrix $A^{k\ell}$ defined by $$(A^{k\ell})_{ij}=\begin{cases} a_{ij} &\text{ if }i\neq \ell\\ a_{kj} &\text{ if }i=\ell\end{cases}$$ which has two equal rows (its $k$-th and $\ell$-th rows both equal the $k$-th row of $A$).
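As a quick numerical sanity check of the identity (a NumPy sketch; the helper `adjugate` and the sample matrix are my own illustration, with 0-based indices in place of the 1-based indices above):

```python
import numpy as np

def adjugate(A):
    """Adjugate via the definition above: (adj A)_{ij} = (-1)^{i+j} det A(j|i)."""
    n = A.shape[0]
    adj = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # A(j|i): strike out the j-th row and the i-th column
            minor = np.delete(np.delete(A, j, axis=0), i, axis=1)
            adj[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return adj

A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])
print(np.allclose(A @ adjugate(A), np.linalg.det(A) * np.eye(3)))  # True
print(np.allclose(adjugate(A) @ A, np.linalg.det(A) * np.eye(3)))  # True
```

Note that the identity holds on both sides, even though the proof above only multiplies in one order.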

Solution 2:

This identity is another way of stating Cramer's rule; see e.g. Wikipedia.

Solution 3:

Let $A=[\alpha_{ij}]$ be a matrix of order $n>1$ over a field $K$, and let $s,r \in \{1,\dots,n\}$ be fixed. Write $A_{rs}$ for the submatrix obtained from $A$ by deleting the $r$-th row and the $s$-th column, and let $\hat{A}_{rs}=(-1)^{r+s} \det A_{rs}$. Then $$\det A = \alpha_{1s}\hat{A}_{1s}+\alpha_{2s}\hat{A}_{2s}+\dots+\alpha_{ns}\hat{A}_{ns}$$ and similarly $$\det A = \alpha_{r1}\hat{A}_{r1}+\alpha_{r2}\hat{A}_{r2}+\dots+\alpha_{rn}\hat{A}_{rn}\tag{1}$$ Then we find that $$\alpha_{i1}\hat{A}_{r1}+\dots+\alpha_{in}\hat{A}_{rn}=\begin{cases} 0, &\text{ for } i \neq r,\\ \det A,& \text{ for } i=r.\end{cases}\tag{2}$$

Similarly, $\alpha_{1i}\hat{A}_{1s}+\dots+\alpha_{ni}\hat{A}_{ns}=0$ for $i \neq s$.

It follows that to every matrix of order $n$ over the field $K$ there corresponds a matrix

$$\operatorname{adj}A = \begin{bmatrix} \hat{A}_{11} & \cdots & \hat{A}_{n1} \\ \vdots & \ddots & \vdots \\ \hat{A}_{1n} & \cdots & \hat{A}_{nn} \end{bmatrix}$$ in $M_n(K)$, such that $$A \cdot \operatorname{adj}A = \det A \cdot I$$

Edit (explanation for $(2)$): By $(1)$, the left-hand side of $(2)$ is the determinant of the matrix $B$ that arises from $A$ by replacing its $r$-th row with its $i$-th row. If $i \neq r$, this matrix has two equal rows ($B_{r\to}=A_{i\to}=B_{i\to}$), so its determinant is zero.
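This row-replacement argument can also be checked numerically (a sketch with NumPy; the sample matrix and the index choices are mine, and indices are 0-based here):

```python
import numpy as np

def cofactor(A, r, s):
    """A-hat_{rs} = (-1)^{r+s} det of A with row r and column s deleted."""
    minor = np.delete(np.delete(A, r, axis=0), s, axis=1)
    return (-1) ** (r + s) * np.linalg.det(minor)

A = np.array([[1., 2., 3.],
              [0., 4., 5.],
              [1., 0., 6.]])
r, i = 0, 2  # pair row-r cofactors with the entries of a *different* row i

# Left-hand side of (2) for i != r
alien = sum(A[i, j] * cofactor(A, r, j) for j in range(3))

# The matrix B from the explanation: replace row r of A by row i
B = A.copy()
B[r] = A[i]  # B now has two equal rows, so det B = 0

print(alien, np.linalg.det(B))  # both are 0 (up to rounding)
```

Since $B$ agrees with $A$ off row $r$, the minors along row $r$ coincide, which is why the "alien cofactor" sum equals $\det B$.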