Sylvester's determinant identity

Hint $\ $ Work universally, i.e. consider the matrix entries as indeterminates $\rm\,a_{\,i\,j},b_{\,i\,j}.\,$ Adjoin them all to $\,\Bbb Z\,$ to get the polynomial ring $\rm\ R = \mathbb Z[a_{\,i\,j},b_{\,i\,j}\,].\, $ Now, $ $ in $\rm\,R,\,$ compute the determinant of $\rm\ (1+A B)\, A = A\, (1+BA)\ $ then cancel $\rm\,\det(A)\,$ (valid because $\rm\,R\,$ is a domain and $\rm\,\det(A)\,$ is a nonzero polynomial). $\ \ $ Extend to non-square matrices by padding appropriately with $0$'s and $1$'s to get square matrices. Note that the proof is purely algebraic - it does not require any topological notions (e.g. density).
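(Not part of the argument, but for anyone who wants to experiment: here is a quick symbolic check of the hint's cancellation step for $2\times 2$ matrices, using sympy. The variable names are my own choices, not anything from the hint.)

```python
# Work in Z[a_ij, b_ij]: verify det((1+AB)A) = det(A(1+BA)) as polynomials,
# then check the identity obtained after cancelling det(A).
from sympy import symbols, Matrix, eye, expand

n = 2
a = symbols(f'a0:{n*n}')          # entries of A as indeterminates
b = symbols(f'b0:{n*n}')          # entries of B as indeterminates
A = Matrix(n, n, a)
B = Matrix(n, n, b)
I = eye(n)

lhs = ((I + A*B) * A).det()       # = det(1+AB) * det(A)
rhs = (A * (I + B*A)).det()       # = det(A) * det(1+BA)
assert expand(lhs - rhs) == 0     # equal in the polynomial ring

# cancelling det(A) (a nonzero polynomial in a domain) gives the identity:
assert expand((I + A*B).det() - (I + B*A).det()) == 0
```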

Alternatively, one may proceed by way of Schur complements (block LDU factorization), namely

$$\rm\left[ \begin{array}{ccc} 1 & \rm A \\ \rm B & 1 \end{array} \right]\ =\ \left[ \begin{array}{ccc} 1 & \rm 0 \\ \rm B & 1 \end{array} \right]\ \left[ \begin{array}{ccc} 1 & \rm 0 \\ \rm 0 & \rm 1-BA \end{array} \right]\ \left[ \begin{array}{ccc} 1 & \rm A \\ \rm 0 & 1 \end{array} \right]$$

$$\rm\phantom{\left[ \begin{array}{ccc} 1 & \rm B \\ \rm A & 1 \end{array} \right]}\ =\ \left[ \begin{array}{ccc} 1 & \rm A \\ \rm 0 & 1 \end{array} \right]\ \left[ \begin{array}{ccc} \rm 1-AB & \rm 0 \\ \rm 0 & \rm 1 \end{array} \right]\ \left[ \begin{array}{ccc} 1 & \rm 0 \\ \rm B & 1 \end{array} \right]$$

See this answer for more on the universality of polynomial identities and related topics, and see also this sci.math thread from 9 Nov 2007.


(1) Start, for fun, with a silly proof for square matrices:

If $A$ is invertible, then $$ \det(I+AB)=\det A^{-1}\cdot\det(I+AB)\cdot\det A=\det(A^{-1}\cdot(I+AB)\cdot A)=\det(I+BA). $$ Now, in general, both $\det(I+AB)$ and $\det(I+BA)$ are continuous functions of $A$, and equal on the dense set where $A$ is invertible, so they are everywhere equal.
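(A quick numerical sanity check of this, not part of the proof: with numpy one can deliberately make $A$ singular, so the identity is being tested precisely in the limit case that the density argument covers.)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
A[:, 0] = 0.0                       # force A to be singular
B = rng.standard_normal((n, n))
I = np.eye(n)

lhs = np.linalg.det(I + A @ B)
rhs = np.linalg.det(I + B @ A)
assert abs(lhs - rhs) < 1e-8 * max(1.0, abs(lhs))   # identity still holds
```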

(2) Now, more seriously:

$$ \det\begin{pmatrix}I&-B\\\\A&I\end{pmatrix} \det\begin{pmatrix}I&B\\\\0&I\end{pmatrix} =\det\begin{pmatrix}I&-B\\\\A&I\end{pmatrix}\begin{pmatrix}I&B\\\\0&I\end{pmatrix} =\det\begin{pmatrix}I&0\\\\A&AB+I\end{pmatrix} =\det(I+AB) $$

and

$$ \det\begin{pmatrix}I&B\\\\0&I\end{pmatrix} \det\begin{pmatrix}I&-B\\\\A&I\end{pmatrix} =\det\begin{pmatrix}I&B\\\\0&I\end{pmatrix} \begin{pmatrix}I&-B\\\\A&I\end{pmatrix} =\det\begin{pmatrix}I+BA&0\\\\A&I\end{pmatrix} =\det(I+BA) $$

The leftmost members of these two equalities are products of the same two scalars, hence equal, so we get the equality you want.
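(If you want to see this block-matrix argument in action, here is a small numerical check with numpy; the matrices are random and the names are mine, not part of the answer.)

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
I = np.eye(n)
Z = np.zeros((n, n))

M = np.block([[I, -B], [A, I]])   # the matrix with A below and -B above
U = np.block([[I,  B], [Z, I]])   # the unitriangular factor

# multiplying in the two orders gives the two products of the answer
left  = np.linalg.det(M @ U)      # = det(I + AB)
right = np.linalg.det(U @ M)      # = det(I + BA)
assert np.isclose(left,  np.linalg.det(I + A @ B))
assert np.isclose(right, np.linalg.det(I + B @ A))
assert np.isclose(left, right)    # same scalars, same product
```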


We will calculate $\det\begin{pmatrix} I_m & -A \\ B & I_n \end{pmatrix}$ in two different ways. We have $$ \det\begin{pmatrix} I_m & -A \\ B & I_n \end{pmatrix} = \det\begin{pmatrix} I_m & 0 \\ B & I_n + BA \end{pmatrix} = \det(I_n + BA). $$ On the other hand, $$ \det\begin{pmatrix} I_m & -A \\ B & I_n \end{pmatrix} = \det\begin{pmatrix} I_m+AB & 0 \\ B & I_n \end{pmatrix} = \det(I_m + AB). $$ (The first equality adds the first block column times $A$ to the second; the other adds $A$ times the second block row to the first. Neither operation changes the determinant.) Comparing the two results gives $\det(I_m + AB) = \det(I_n + BA)$.
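(This proof works for rectangular $A$ and $B$ as well, and that is easy to test numerically; here is a small numpy check with my own choice of shapes.)

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 2, 5                          # rectangular: A is m x n, B is n x m
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))

# the (m+n) x (m+n) block matrix from the proof
M = np.block([[np.eye(m), -A], [B, np.eye(n)]])
det_M = np.linalg.det(M)

assert np.isclose(det_M, np.linalg.det(np.eye(n) + B @ A))   # first way
assert np.isclose(det_M, np.linalg.det(np.eye(m) + A @ B))   # second way
```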