23

Sylvester's determinant identity states that if $A$ and $B$ are matrices of sizes $m\times n$ and $n\times m$, then

$$ \det(I_m+AB) = \det(I_n+BA)$$

where $I_m$ and $I_n$ denote the $m \times m$ and $n \times n$ identity matrices, respectively.

Could you sketch a proof for me, or point to an accessible reference?
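As a quick numerical sanity check (my addition, not part of the question), the identity can be verified with NumPy for a random non-square pair:

```python
import numpy as np

# Random m x n and n x m matrices; the identity should hold
# (up to floating-point error) regardless of the shapes.
rng = np.random.default_rng(0)
m, n = 3, 5
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))

lhs = np.linalg.det(np.eye(m) + A @ B)  # m x m determinant
rhs = np.linalg.det(np.eye(n) + B @ A)  # n x n determinant
print(np.isclose(lhs, rhs))  # True
```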

  • 1
    See also http://terrytao.wordpress.com/2013/01/13/matrix-identities-as-derivatives-of-determinant-identities/ 2013-01-13

4 Answers

32

Hint $\ $ Work universally, i.e. consider the matrix entries as indeterminates $\rm\,a_{\,i\,j},b_{\,i\,j}.\,$ Adjoin them all to $\,\Bbb Z\,$ to get the polynomial ring $\rm\ R = \mathbb Z[a_{\,i\,j},b_{\,i\,j}\,].\, $ Now, $ $ in $\rm\,R,\,$ compute the determinant of $\rm\ (1+A B)\, A = A\, (1+BA)\ $ then cancel $\rm\ det(A)\ \ $ (which is valid because $\,\rm R\,$ is a domain). $\ \ $ Extend to non-square matrices by padding appropriately with $0$'s and $1$'s to get square matrices. Note that the proof is purely algebraic - it does not require any topological notions (e.g. density).
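One concrete way to carry out the padding step (a numerical sketch of mine, not from the answer; zero blocks already suffice here): for $m < n$, pad $A$ with zero rows and $B$ with zero columns to obtain $n\times n$ matrices, which changes neither determinant.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 2, 4
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))

# Pad A with zero rows and B with zero columns to make both n x n.
A_sq = np.vstack([A, np.zeros((n - m, n))])
B_sq = np.hstack([B, np.zeros((n, n - m))])

# A_sq @ B_sq = [[A@B, 0], [0, 0]], so det(I_n + A_sq@B_sq) = det(I_m + A@B),
# while B_sq @ A_sq equals B @ A exactly.
assert np.isclose(np.linalg.det(np.eye(n) + A_sq @ B_sq),
                  np.linalg.det(np.eye(m) + A @ B))
assert np.allclose(B_sq @ A_sq, B @ A)
```

The square case of the identity applied to the padded pair then yields the non-square case.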

Alternatively, one may proceed by way of Schur decomposition, namely

$$\rm\left[ \begin{array}{ccc} 1 & \rm A \\ \rm B & 1 \end{array} \right]\ =\ \left[ \begin{array}{ccc} 1 & \rm 0 \\ \rm B & 1 \end{array} \right]\ \left[ \begin{array}{ccc} 1 & \rm 0 \\ \rm 0 & \rm 1-BA \end{array} \right]\ \left[ \begin{array}{ccc} 1 & \rm A \\ \rm 0 & 1 \end{array} \right]$$

$$\rm\phantom{\left[ \begin{array}{ccc} 1 & \rm B \\ \rm A & 1 \end{array} \right]}\ =\ \left[ \begin{array}{ccc} 1 & \rm A \\ \rm 0 & 1 \end{array} \right]\ \left[ \begin{array}{ccc} \rm 1-AB & \rm 0 \\ \rm 0 & \rm 1 \end{array} \right]\ \left[ \begin{array}{ccc} 1 & \rm 0 \\ \rm B & 1 \end{array} \right]$$
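Both block factorizations above can be checked numerically (a sketch of mine, using NumPy; here $A$ is $m\times n$, $B$ is $n\times m$, and each $1$ denotes an identity block of the appropriate size):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 2
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))
Im, In = np.eye(m), np.eye(n)
Zmn, Znm = np.zeros((m, n)), np.zeros((n, m))

M = np.block([[Im, A], [B, In]])
# First factorization: lower-triangular, block-diagonal, upper-triangular.
first = (np.block([[Im, Zmn], [B, In]])
         @ np.block([[Im, Zmn], [Znm, In - B @ A]])
         @ np.block([[Im, A], [Znm, In]]))
# Second factorization: upper-triangular, block-diagonal, lower-triangular.
second = (np.block([[Im, A], [Znm, In]])
          @ np.block([[Im - A @ B, Zmn], [Znm, In]])
          @ np.block([[Im, Zmn], [B, In]]))
assert np.allclose(M, first) and np.allclose(M, second)
```

Taking determinants of both factorizations gives $\det(1-AB) = \det(1-BA)$; replacing $B$ by $-B$ recovers the stated identity.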

See my posts in this sci.math thread on 09 Nov 2007 for further discussion.

  • 1
    "simply pad-up appropriately with 0's and 1's to get square matrices." Oh, I can't believe it! Very nice! Many thanks. 2011-01-17
  • 2
    There is rarely *need* for anything... For example, there is no need for proofs to be *purely algebraic* :) 2011-01-17
  • 0
    @Mariano: It's a shining example of the power of universal proofs - which deserves emphasis (esp. since this simple algebraic proof is often overlooked - even by some professional mathematicians). 2011-01-17
19

(1) Start, for fun, with a silly proof for square matrices:

If $A$ is invertible, then $$ \det(I+AB)=\det A^{-1}\cdot\det(I+AB)\cdot\det A=\det(A^{-1}\cdot(I+AB)\cdot A)=\det(I+BA). $$ Now, in general, both $\det(I+AB)$ and $\det(I+BA)$ are continuous functions of $A$, and equal on the dense set where $A$ is invertible, so they are everywhere equal.
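To illustrate the density step (a numerical aside of mine, with NumPy): the identity holds even when $A$ is square but singular, exactly as the continuity argument predicts.

```python
import numpy as np

# A is deliberately singular (rank 1), so the "conjugate by A" trick fails
# directly, yet the identity survives by continuity/density.
rng = np.random.default_rng(5)
n = 4
A = rng.standard_normal((n, 1)) @ rng.standard_normal((1, n))
B = rng.standard_normal((n, n))

lhs = np.linalg.det(np.eye(n) + A @ B)
rhs = np.linalg.det(np.eye(n) + B @ A)
print(np.isclose(lhs, rhs))  # True
```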

(2) Now, more seriously:

$$ \det\begin{pmatrix}I&-B\\\\A&I\end{pmatrix} \det\begin{pmatrix}I&B\\\\0&I\end{pmatrix} =\det\begin{pmatrix}I&-B\\\\A&I\end{pmatrix}\begin{pmatrix}I&B\\\\0&I\end{pmatrix} =\det\begin{pmatrix}I&0\\\\A&AB+I\end{pmatrix} =\det(I+AB) $$

and

$$ \det\begin{pmatrix}I&B\\\\0&I\end{pmatrix} \det\begin{pmatrix}I&-B\\\\A&I\end{pmatrix} =\det\begin{pmatrix}I&B\\\\0&I\end{pmatrix} \begin{pmatrix}I&-B\\\\A&I\end{pmatrix} =\det\begin{pmatrix}I+BA&0\\\\A&I\end{pmatrix} =\det(I+BA) $$

Since the leftmost members of these two equalities are equal, we get the equality you want.
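The two triangularizations can be checked numerically (my sketch, with NumPy; since $A$ sits in the bottom-left block, the top-left identity block is $I_n$ when $A$ is $m\times n$):

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 2, 3
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))
In, Im = np.eye(n), np.eye(m)

M = np.block([[In, -B], [A, Im]])                 # the mixed block matrix
E = np.block([[In, B], [np.zeros((m, n)), Im]])   # det(E) = 1

# Multiplying by E on the right or on the left triangularizes M in two ways.
d1 = np.linalg.det(M @ E)   # = det(I_m + A@B)
d2 = np.linalg.det(E @ M)   # = det(I_n + B@A)
assert np.isclose(d1, np.linalg.det(Im + A @ B))
assert np.isclose(d2, np.linalg.det(In + B @ A))
assert np.isclose(d1, d2)
```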

  • 1
    Nice argument. I guess this is very close to the "Schur decomposition" method suggested by Professor Dubuque. 2011-01-17
  • 1
    @Bruce, for the non-square situation you can argue similarly to the first part by using the fact that surjections $\mathbb R^n\to\mathbb R^m$, when $n\geq m$, are dense in the set of all matrices. 2011-01-17
  • 0
    @MarianoSuárez-Álvarez: Are you sure about the surjection argument? We still can't make sense of $\det A$ when $A$ is not square. 2018-11-19
6

We will calculate $\det\begin{pmatrix} I_m & -A \\ B & I_n \end{pmatrix}$ in two different ways. We have $$ \det\begin{pmatrix} I_m & -A \\ B & I_n \end{pmatrix} = \det\begin{pmatrix} I_m & 0 \\ B & I_n + BA \end{pmatrix} = \det(I_n + BA). $$ On the other hand, $$ \det\begin{pmatrix} I_m & -A \\ B & I_n \end{pmatrix} = \det\begin{pmatrix} I_m+AB & 0 \\ B & I_n \end{pmatrix} = \det(I_m + AB). $$

  • 1
    Maybe it's useful to show *why* one can do that, namely by multiplying on the right or on the left by the determinant-$1$ matrix $\begin{pmatrix}I_m & A\\ 0 & I_n\end{pmatrix}$. 2017-01-05
  • 0
    What exactly is that determinant? Are its entries themselves matrices? 2017-01-05
  • 0
    As I mentioned in [my post](http://math.stackexchange.com/a/17837/242) in the linked dupe, this is a special case of Schur decomposition. 2017-01-05
2

Here is another proof of $\det(1 + AB) = \det(1+BA).$ We use two facts: the nonzero eigenvalues of $AB$ and $BA$ are the same, and the determinant of a matrix is the product of its eigenvalues. Since $\det(1+AB)=\prod_i(1+\lambda_i)$ over the eigenvalues $\lambda_i$ of $AB$, and the extra eigenvalues of the larger of the two products are all $0$ (each contributing a factor $1+0=1$), the two determinants agree.
To see the claim about nonzero eigenvalues, take an eigenvalue $\lambda \neq 0$ of $AB$ and a corresponding eigenvector $x \neq 0.$ We claim that $y = Bx$ is an eigenvector of $BA$ corresponding to the same eigenvalue $\lambda.$ Indeed, $ABx = Ay = \lambda x \neq 0,$ therefore $y \neq 0.$ Now we compute $BAy = B(ABx) = B(\lambda x) = \lambda y.$ We are done with the proof.
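Both facts can be checked numerically (my addition, with NumPy):

```python
import numpy as np

# AB and BA share their nonzero eigenvalues, and det(1+AB) equals the
# product of (1 + lambda) over the eigenvalues lambda of AB.
rng = np.random.default_rng(4)
m, n = 2, 4
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))

ev_AB = np.linalg.eigvals(A @ B)   # m eigenvalues
ev_BA = np.linalg.eigvals(B @ A)   # n eigenvalues; n - m of them are ~0

nonzero_BA = ev_BA[np.abs(ev_BA) > 1e-9]
assert np.allclose(np.sort_complex(nonzero_BA), np.sort_complex(ev_AB))
assert np.isclose(np.prod(1 + ev_AB),
                  np.linalg.det(np.eye(m) + A @ B))
```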