Proof for '$AB = I$ then $BA = I$' without Motivation?

Here is a sketch of a "brutal proof" of the sort you imagine. (On rereading the other answers, you've already seen this. But the notes at the end may be interesting.) A more detailed version of the following can be found in "Bijective Matrix Algebra", by Loehr and Mendes.

Remember that the adjugate matrix, $\mathrm{Ad}(A)$, is defined to have $(i,j)$ entry equal to $(-1)^{i+j}$ times the determinant of the $(n-1) \times (n-1)$ matrix obtained by deleting the $j$-th row and $i$-th column from $A$; that is, $\mathrm{Ad}(A)$ is the transpose of the cofactor matrix.

Write down a brute force proof of the identity: $$\mathrm{Ad}(A) \cdot A = A \cdot \mathrm{Ad}(A) = \det A \cdot \mathrm{Id}_n$$ by grouping like terms on both sides and rearranging.
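As a quick sanity check (not a substitute for the brute force proof), the identity can be verified entrywise for a sample integer matrix in a few lines of Python, with the determinant and adjugate computed straight from their definitions:

```python
# Spot-check Ad(A) * A = A * Ad(A) = det(A) * Id, with everything
# computed entrywise from the definitions. A numerical sanity check,
# not a proof.

def minor(A, i, j):
    """Matrix A with row i and column j deleted."""
    return [[A[r][c] for c in range(len(A)) if c != j]
            for r in range(len(A)) if r != i]

def det(A):
    """Determinant by cofactor expansion along the first row."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor(A, 0, j))
               for j in range(len(A)))

def adjugate(A):
    """(i, j) entry is (-1)^(i+j) * det(A with row j and column i deleted)."""
    n = len(A)
    return [[(-1) ** (i + j) * det(minor(A, j, i)) for j in range(n)]
            for i in range(n)]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[2, 1, 0], [1, 3, 1], [0, 1, 4]]
d = det(A)
scaled_id = [[d if i == j else 0 for j in range(3)] for i in range(3)]
assert matmul(adjugate(A), A) == scaled_id
assert matmul(A, adjugate(A)) == scaled_id
```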

Likewise, write down a brute force proof of the identity $$\det(AB) = \det(A) \det(B).$$ So, if $AB=\mathrm{Id}$, you know that $\det(A) \det(B)=1$.
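The multiplicativity of the determinant admits the same kind of entrywise spot-check; a minimal sketch, again computing everything from the definitions:

```python
# Spot-check det(AB) = det(A) * det(B) on sample integer matrices.
# A numerical sanity check of the identity, not a proof.

def minor(A, i, j):
    return [[A[r][c] for c in range(len(A)) if c != j]
            for r in range(len(A)) if r != i]

def det(A):
    """Determinant by cofactor expansion along the first row."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor(A, 0, j))
               for j in range(len(A)))

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 2, 3], [0, 1, 4], [5, 6, 0]]
B = [[2, 0, 1], [1, 1, 0], [0, 3, 2]]
assert det(matmul(A, B)) == det(A) * det(B)
```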

Now compute $\mathrm{Ad}(A) ABA$ in two ways: $$(\mathrm{Ad}(A) A)BA = \det(A)\, BA$$ and $$\mathrm{Ad}(A) (AB) A = \mathrm{Ad}(A) A = \det(A)\, \mathrm{Id}.$$ Comparing the two gives $\det(A)\, BA = \det(A)\, \mathrm{Id}$. Since $\det(A) \det(B) = 1$, multiplying both sides by the scalar $\det(B)$ (which commutes with everything) cancels the $\det(A)$, and you get to deduce that $BA = \mathrm{Id}$.

There is some interesting math here. Let $R$ be the polynomial ring in $2n^2$ variables $x_{ij}$ and $y_{ij}$, and let $X$ and $Y$ be the $n \times n$ matrices with these entries. Let $C_{ij}$ be the entries of the matrix $XY-\mathrm{Id}$ and let $D_{ij}$ be the entries of the matrix $YX-\mathrm{Id}$. Tracing through the above proof (if your subproofs are brutal enough) should give you identities of the form $D_{ij} = \sum_{k,\ell} P_{k \ell}^{ij} C_{k \ell}$. It's an interesting question how simple, either in terms of degree or circuit length, the polynomials $P_{k \ell}^{ij}$ can be made.
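One exact polynomial identity that can be extracted from the computation above is $\det(X)\,(YX - \mathrm{Id}) = \mathrm{Ad}(X)\,(XY - \mathrm{Id})\,X$, valid over any commutative ring; multiplying by $\det(Y)$ then expresses the $D_{ij}$ in terms of the $C_{k\ell}$ once $\det(X)\det(Y)$ is reduced to $1$. As a quick sanity check (an evaluation at random points, not a proof), both sides can be compared at random integer matrices:

```python
# Spot-check the polynomial identity
#   det(X) * (YX - Id) = Ad(X) * (XY - Id) * X
# at random integer matrices. An evaluation check, not a proof.
import random

def minor(A, i, j):
    return [[A[r][c] for c in range(len(A)) if c != j]
            for r in range(len(A)) if r != i]

def det(A):
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor(A, 0, j))
               for j in range(len(A)))

def adjugate(A):
    """Transpose of the cofactor matrix."""
    n = len(A)
    return [[(-1) ** (i + j) * det(minor(A, j, i)) for j in range(n)]
            for i in range(n)]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matsub(A, B):
    return [[x - y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def scale(c, A):
    return [[c * x for x in row] for row in A]

n = 3
rng = random.Random(0)
X = [[rng.randint(-5, 5) for _ in range(n)] for _ in range(n)]
Y = [[rng.randint(-5, 5) for _ in range(n)] for _ in range(n)]
Id = [[int(i == j) for j in range(n)] for i in range(n)]

lhs = scale(det(X), matsub(matmul(Y, X), Id))
rhs = matmul(matmul(adjugate(X), matsub(matmul(X, Y), Id)), X)
assert lhs == rhs
```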

I blogged about this question and learned about some relevant papers (1 2), which I must admit I don't fully understand.


I started to wonder whether there is any "brutal" proof that does not visit the "higher" domain of algebraic structures, and instead uses only simple componentwise algebraic operations to prove, from the given condition, that the dot product of the $i$-th row of $B$ and the $j$-th column of $A$ equals the Kronecker delta $\delta_{ij}$.

Should we think of a matrix as more than a mere 'number box' to show $BA=I$?

It sounds a little bit like you're suggesting that merely having $A$ and $B$ and the formal multiplications of elements in the matrix equation $AB=I$ is enough to show that $BA=I$.

But the truth of this claim depends on properties of the ring that the matrix entries come from!

So no, the matrices and their multiplication alone are not sufficient to prove the statement $AB=I\implies BA=I$.


In several places on math.SE, there is an example of a ring $R$ with two elements $a,b$ such that $ab=1$ and $ba\neq 1$, and that gives an example of a matrix with $n=1$ where the statement isn't true.

It takes about as long to give the example as to find an appropriate link, so for completeness I'll include it. Take $R$ to be the ring of linear transformations of the vector space of polynomials $\Bbb R[x]$, take $a$ to be the derivative operator on $\Bbb R[x]$, and $b$ to be the antiderivative with constant term set to $0$. It's easy to see that $ab=I$, since differentiating an antiderivative recovers the original polynomial, but $ba\neq I$ because $ba(2)=b(0)=0$.
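If it helps to see the counterexample concretely, here is a small Python sketch: polynomials are represented as coefficient lists, $a$ differentiates, and $b$ antidifferentiates with constant term $0$. Then $a\circ b$ fixes every polynomial, but $b\circ a$ kills the constant term:

```python
# Concrete counterexample sketch: a polynomial is a coefficient list
# [c0, c1, c2, ...] representing c0 + c1*x + c2*x^2 + ...
# a(b(p)) == p for every p, but b(a(p)) drops the constant term.
from fractions import Fraction  # exact arithmetic for the antiderivative

def a(p):
    """Derivative: d/dx of sum(c_k x^k) is sum(k * c_k * x^(k-1))."""
    return [k * p[k] for k in range(1, len(p))]

def b(p):
    """Antiderivative with constant term 0: c_k x^k -> c_k/(k+1) x^(k+1)."""
    return [Fraction(0)] + [Fraction(c, k + 1) for k, c in enumerate(p)]

assert a(b([1, 2, 3])) == [1, 2, 3]   # ab = I on this sample
assert b(a([2])) == [0]               # ba sends the constant 2 to 0
assert b(a([2])) != [2]               # so ba != I
```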

And if you're really hankering for an example with $n>1$, just use $M_2(R)$ over this ring. Obviously $\begin{bmatrix}a&0\\0&a\end{bmatrix}$ and $\begin{bmatrix}b&0\\0&b\end{bmatrix}$ do the trick.
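The $n=2$ case can also be made concrete, representing each matrix entry as an operator on coefficient lists (the helpers `padd` and `apply_mat` are just ad hoc names for this sketch; $a$ differentiates and $b$ antidifferentiates with constant term $0$):

```python
# diag(a, a) * diag(b, b) acts as the identity on pairs of polynomials
# (coefficient lists), but diag(b, b) * diag(a, a) does not.
from fractions import Fraction

def a(p):
    """Derivative of a coefficient list."""
    return [k * p[k] for k in range(1, len(p))]

def b(p):
    """Antiderivative with constant term 0."""
    return [Fraction(0)] + [Fraction(c, k + 1) for k, c in enumerate(p)]

def padd(p, q):
    """Coefficientwise sum of two polynomials."""
    m = max(len(p), len(q))
    p = p + [0] * (m - len(p))
    q = q + [0] * (m - len(q))
    return [x + y for x, y in zip(p, q)]

def apply_mat(M, v):
    """Apply a matrix of operators to a vector of polynomials."""
    out = []
    for row in M:
        s = []
        for op, poly in zip(row, v):
            s = padd(s, op(poly))
        out.append(s)
    return out

zero = lambda p: []          # the zero operator
A = [[a, zero], [zero, a]]   # diag(a, a)
B = [[b, zero], [zero, b]]   # diag(b, b)

v = [[1, 2, 3], [5]]         # the pair (1 + 2x + 3x^2, 5)
assert apply_mat(A, apply_mat(B, v)) == v   # AB acts as the identity
assert apply_mat(B, apply_mat(A, v)) != v   # BA loses the constant terms
```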


The proposition is true for large classes of rings, though. A ring $R$ is called stably finite if, for every $n\in \Bbb Z^+$ and $A,B\in M_n(R)$, $AB=I\implies BA=I$. Perhaps the two broadest classes of rings with this property are right Noetherian rings and commutative rings.

If all you are interested in is commutative rings, then I can think of no better solution than the $A\,\mathrm{Adj}(A)=\mathrm{Adj}(A)\,A=\det A\cdot I$ proof given already. It is essentially brute force computation with lots of determinants, all phrased in terms of the entries of the matrices. Determinants are not available (or are at least extremely complicated) in noncommutative rings, which is why commutativity saves the day.