Are inverse matrices unique?

More generally, in any situation where the associative law holds, if some $x$ has both a left inverse $l$ and a right inverse $r$, then $l=r$. The reason is that $l=l(xr)=(lx)r=r$. In particular, if $x$ has a two-sided inverse, then that inverse is unique. On the other hand, it is entirely possible for some $x$ to have many different left inverses if it has no right inverse, and likewise many different right inverses if it has no left inverse. Both of these possibilities actually occur in the case of non-square matrices.
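The same phenomenon can be seen outside matrices. The following sketch (a hypothetical example, not from the answer above) uses function composition: the shift $s(n)=n+1$ on the natural numbers is injective but not surjective, so it has many left inverses and no right inverse.

```python
# The shift s(n) = n + 1 on the natural numbers: injective but not
# surjective (0 is never an output), so it has left inverses but no
# right inverse, and the left inverses are not unique.
def s(n):
    return n + 1

def t1(n):          # one left inverse: undoes s, sends 0 to 0
    return max(n - 1, 0)

def t2(n):          # a different left inverse: sends 0 to 42
    return n - 1 if n > 0 else 42

for n in range(5):
    assert t1(s(n)) == n and t2(s(n)) == n   # both undo s
assert t1(0) != t2(0)                        # yet t1 and t2 differ
```

By the argument above, if $s$ also had a right inverse, then $t_1$ and $t_2$ would be forced to be equal; since they differ, no right inverse can exist.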


A specific counterexample in the non-square case: let $A=\left(\begin{smallmatrix}1&0&0\\0&1&0\end{smallmatrix}\right)$.

Then there is a matrix $B$ such that $AB=I$, namely $B=\left(\begin{smallmatrix}1&0\\0&1\\x&y\end{smallmatrix}\right)$. Note that $x,y$ can each be anything, so this right inverse is not unique. Also note that $I$ is $2\times 2$.
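Assuming NumPy is available, a quick numeric check of this family of right inverses:

```python
import numpy as np

A = np.array([[1, 0, 0],
              [0, 1, 0]])

# The free entries x, y can be anything; each choice gives a right inverse.
for x, y in [(0, 0), (3, -1), (7, 2)]:
    B = np.array([[1, 0],
                  [0, 1],
                  [x, y]])
    assert np.array_equal(A @ B, np.eye(2, dtype=int))  # 2x2 identity
```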

On the other hand, there is no matrix $C$ such that $CA=I$, where now $I$ would have to be $3\times 3$. The reason is that the last column of $A$ is all zeroes, so the last column of $CA$ would be all zeroes as well. Thus $A$ has no left inverse at all.
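The zero-column argument can be checked numerically as well (again assuming NumPy; the candidate $C$ below is an arbitrary choice for illustration):

```python
import numpy as np

A = np.array([[1, 0, 0],
              [0, 1, 0]])

# Whatever 3x2 matrix C we try, the last column of C @ A is all zeroes,
# so C @ A can never be the 3x3 identity.
C = np.arange(6).reshape(3, 2)   # an arbitrary candidate left inverse
product = C @ A
assert np.array_equal(product[:, 2], np.zeros(3))   # zero last column
assert not np.array_equal(product, np.eye(3))
```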


If $A$, $B$ are square matrices with the same inverse $C$, then $AC=CA=I$ and $BC=CB=I$. Therefore, $$ A =AI= A(CB)= (AC)B = IB = B. $$ The odd thing about matrices is this: if $A$, $B$ are $n\times n$ matrices over a field, then $AB=I$ iff $BA=I$. This is a consequence of the fact that the $n\times n$ matrices over a field form a finite-dimensional linear space: $AB=I$ forces $B$ to be injective as a linear map, hence bijective in finite dimension, so $B$ is invertible and $A=A(BB^{-1})=(AB)B^{-1}=B^{-1}$, giving $BA=I$.
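A numeric illustration of this one-sided-implies-two-sided fact, assuming NumPy (a random Gaussian matrix is invertible with probability 1):

```python
import numpy as np

# For square matrices over the reals, a one-sided inverse is
# automatically two-sided.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))   # invertible with probability 1
B = np.linalg.inv(A)

assert np.allclose(A @ B, np.eye(4))   # right inverse...
assert np.allclose(B @ A, np.eye(4))   # ...is also a left inverse
```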


Note that $GL(n, \mathbb{F})$, the set of invertible $n\times n$ matrices over the field $\mathbb{F}$, is a group. In any group, inverses are unique, so if $a^{-1} = b^{-1}$, by taking inverses it follows that $a = b$. In particular, this applies to the group $GL(n, \mathbb{F})$.
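A small sketch of that uniqueness, assuming NumPy: computing an inverse of the same invertible $C$ by two different routes must yield the same matrix.

```python
import numpy as np

# Inverses in the group GL(n) are unique: two matrices that both invert
# the same C must coincide.
rng = np.random.default_rng(2)
C = rng.standard_normal((3, 3))          # invertible with probability 1
A = np.linalg.inv(C)                     # one inverse of C
B = np.linalg.solve(C.T, np.eye(3)).T    # another route to an inverse of C
assert np.allclose(A, B)                 # they agree: the inverse is unique
```

The second computation solves $C^{\mathsf T}X=I$, so $X=(C^{-1})^{\mathsf T}$ and $X^{\mathsf T}=C^{-1}$; by uniqueness it must match `np.linalg.inv(C)` up to floating-point error.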