Sylvester's determinant identity
Sylvester's determinant identity states that if $A$ and $B$ are matrices of sizes $m\times n$ and $n\times m$, then
$$ \det(I_m+AB) = \det(I_n+BA)$$
where $I_m$ and $I_n$ denote the $m \times m$ and $n \times n$ identity matrices, respectively.
Could you sketch a proof for me, or point to an accessible reference?
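(For what it's worth, here is a quick numerical sanity check of the statement; a minimal sketch using NumPy, with the sizes and the random matrices chosen arbitrarily.)

```python
import numpy as np

# Check det(I_m + AB) = det(I_n + BA) for a random rectangular pair A (m x n), B (n x m).
rng = np.random.default_rng(0)
m, n = 3, 5
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))

lhs = np.linalg.det(np.eye(m) + A @ B)
rhs = np.linalg.det(np.eye(n) + B @ A)
print(lhs, rhs)                      # agree up to floating-point error
assert np.isclose(lhs, rhs)
```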
Hint $\ $ Work universally, i.e. consider the matrix entries as indeterminates $\rm\,a_{\,i\,j},b_{\,i\,j}.\,$ Adjoin them all to $\,\Bbb Z\,$ to get the polynomial ring $\rm\ R = \mathbb Z[a_{\,i\,j},b_{\,i\,j}\,].\, $ Now, $ $ in $\rm\,R,\,$ compute the determinant of $\rm\ (1+A B)\, A = A\, (1+BA)\ $ then cancel $\rm\ det(A)\ \ $ (which is valid because $\,\rm R\,$ is a domain and $\rm\ det(A)\,$ is a nonzero polynomial, hence cancellable). $\ \ $ Extend to non-square matrices by padding appropriately with $0$'s and $1$'s to get square matrices. Note that the proof is purely algebraic - it does not require any topological notions (e.g. density).
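To see the universal viewpoint concretely, one can let a computer algebra system treat the entries as indeterminates. A minimal SymPy sketch (the sizes $2\times 3$ and $3\times 2$ are chosen arbitrarily):

```python
import sympy as sp

# Entries of A (2x3) and B (3x2) are indeterminates, as in the hint, so the
# check below verifies det(I + AB) = det(I + BA) as a polynomial identity.
A = sp.Matrix(2, 3, lambda i, j: sp.Symbol(f'a{i}{j}'))
B = sp.Matrix(3, 2, lambda i, j: sp.Symbol(f'b{i}{j}'))

lhs = (sp.eye(2) + A * B).det()
rhs = (sp.eye(3) + B * A).det()
print(sp.expand(lhs - rhs))          # prints 0: the two polynomials coincide
```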
Alternatively, one may proceed by way of the Schur complement (block LDU factorization): taking determinants in the two factorizations below gives $\rm\,\det(1-AB) = \det(1-BA),\,$ and replacing $\rm\,B\,$ by $\rm\,-B\,$ yields the identity. Namely,
$$\rm\left[ \begin{array}{ccc} 1 & \rm A \\ \rm B & 1 \end{array} \right]\ =\ \left[ \begin{array}{ccc} 1 & \rm 0 \\ \rm B & 1 \end{array} \right]\ \left[ \begin{array}{ccc} 1 & \rm 0 \\ \rm 0 & \rm 1-BA \end{array} \right]\ \left[ \begin{array}{ccc} 1 & \rm A \\ \rm 0 & 1 \end{array} \right]$$
$$\rm\phantom{\left[ \begin{array}{ccc} 1 & \rm B \\ \rm A & 1 \end{array} \right]}\ =\ \left[ \begin{array}{ccc} 1 & \rm A \\ \rm 0 & 1 \end{array} \right]\ \left[ \begin{array}{ccc} \rm 1-AB & \rm 0 \\ \rm 0 & \rm 1 \end{array} \right]\ \left[ \begin{array}{ccc} 1 & \rm 0 \\ \rm B & 1 \end{array} \right]$$
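As a concrete check of these two factorizations, here is a short NumPy sketch (sizes and random entries arbitrary; `np.block` assembles the block matrices):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 2, 4
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))
Im, In, Z = np.eye(m), np.eye(n), np.zeros((m, n))

M = np.block([[Im, A], [B, In]])

# lower-unitriangular * diag(1, 1-BA) * upper-unitriangular
F1 = (np.block([[Im, Z], [B, In]])
      @ np.block([[Im, Z], [Z.T, In - B @ A]])
      @ np.block([[Im, A], [Z.T, In]]))

# upper-unitriangular * diag(1-AB, 1) * lower-unitriangular
F2 = (np.block([[Im, A], [Z.T, In]])
      @ np.block([[Im - A @ B, Z], [Z.T, In]])
      @ np.block([[Im, Z], [B, In]]))

assert np.allclose(M, F1) and np.allclose(M, F2)
# Taking determinants: det(1 - BA) = det(M) = det(1 - AB); replace B by -B for the identity.
print(np.linalg.det(In - B @ A), np.linalg.det(Im - A @ B))
```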
See this answer for more on universality of polynomial identities and related topics, and see also this sci.math thread from 9 Nov 2007.
(1) Start, for fun, with a silly proof for square matrices:
If $A$ is invertible, then $$ \det(I+AB)=\det A^{-1}\cdot\det(I+AB)\cdot\det A=\det(A^{-1}\cdot(I+AB)\cdot A)=\det(I+BA). $$ Now, in general (working over $\mathbb R$ or $\mathbb C$), both $\det(I+AB)$ and $\det(I+BA)$ are continuous functions of $A$ and agree on the dense set where $A$ is invertible, so they agree everywhere.
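To illustrate the conclusion of the density argument, a quick check with a deliberately singular $A$ (a hand-picked example, nothing canonical about it):

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 4.0]])   # rank 1, hence not invertible
B = np.array([[0.0, 1.0], [3.0, -1.0]])

lhs = np.linalg.det(np.eye(2) + A @ B)
rhs = np.linalg.det(np.eye(2) + B @ A)
print(lhs, rhs)                          # both equal 5, even though A is singular
assert np.isclose(lhs, rhs)
```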
(2) Now, more seriously:
$$ \det\begin{pmatrix}I&-B\\A&I\end{pmatrix} \det\begin{pmatrix}I&B\\0&I\end{pmatrix} =\det\begin{pmatrix}I&-B\\A&I\end{pmatrix}\begin{pmatrix}I&B\\0&I\end{pmatrix} =\det\begin{pmatrix}I&0\\A&AB+I\end{pmatrix} =\det(I+AB) $$
and
$$ \det\begin{pmatrix}I&B\\0&I\end{pmatrix} \det\begin{pmatrix}I&-B\\A&I\end{pmatrix} =\det\begin{pmatrix}I&B\\0&I\end{pmatrix} \begin{pmatrix}I&-B\\A&I\end{pmatrix} =\det\begin{pmatrix}I+BA&0\\A&I\end{pmatrix} =\det(I+BA) $$
Since the leftmost members of these two equalities are equal, we get the equality you want.
We will calculate $\det\begin{pmatrix} I_m & -A \\ B & I_n \end{pmatrix}$ in two different ways. Adding $A$ times the first block column to the second (which does not change the determinant), we get $$ \det\begin{pmatrix} I_m & -A \\ B & I_n \end{pmatrix} = \det\begin{pmatrix} I_m & 0 \\ B & I_n + BA \end{pmatrix} = \det(I_n + BA). $$ On the other hand, adding $A$ times the second block row to the first gives $$ \det\begin{pmatrix} I_m & -A \\ B & I_n \end{pmatrix} = \det\begin{pmatrix} I_m+AB & 0 \\ B & I_n \end{pmatrix} = \det(I_m + AB). $$
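Here is the same computation done numerically for a rectangular pair, just as a sanity check (an informal NumPy sketch, not part of the argument):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 2
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))

# det of the bordered block matrix equals both det(I_n + BA) and det(I_m + AB)
M = np.block([[np.eye(m), -A], [B, np.eye(n)]])
d = np.linalg.det(M)
assert np.isclose(d, np.linalg.det(np.eye(n) + B @ A))
assert np.isclose(d, np.linalg.det(np.eye(m) + A @ B))
print(d)
```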
Here is another proof of $\det(1 + AB) = \det(1+BA)$. We will use the facts that
the nonzero eigenvalues of $AB$ and $BA$ are the same (with the same algebraic multiplicities) and that the determinant of a matrix is the product of its eigenvalues. Take an eigenvalue $\lambda \neq 0$ of $AB$ and a corresponding eigenvector $x \neq 0.$ It is claimed that $y = Bx$ is an eigenvector of $BA$ corresponding to the same eigenvalue $\lambda.$
Indeed, since $\lambda \neq 0$ and $x \neq 0$ we have $ABx = Ay = \lambda x \neq 0,$ therefore $y \neq 0;$ and $BAy = B(ABx) = B(\lambda x) = \lambda y.$ Since $1+AB$ has eigenvalues $1+\lambda$ as $\lambda$ ranges over the eigenvalues of $AB$ (and likewise for $1+BA$), and the zero eigenvalues contribute only factors of $1$, the two determinants coincide.
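To illustrate this spectral argument numerically, a small NumPy sketch (random rectangular matrices; the extra eigenvalues of the larger product are numerically zero):

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 2, 4
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))

ev_AB = np.linalg.eigvals(A @ B)     # m eigenvalues
ev_BA = np.linalg.eigvals(B @ A)     # n eigenvalues; the nonzero ones match ev_AB
print(np.sort_complex(ev_AB))
print(np.sort_complex(ev_BA))

# det(1 + M) is the product of (1 + eigenvalue); zero eigenvalues contribute factors of 1
assert np.isclose(np.prod(1 + ev_AB), np.prod(1 + ev_BA))
assert np.isclose(np.prod(1 + ev_AB), np.linalg.det(np.eye(m) + A @ B))
```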