Hint $\ $ Work universally, i.e. consider the matrix entries as indeterminates $\rm\,a_{\,i\,j},b_{\,i\,j}.\,$ Adjoin them all to $\,\Bbb Z\,$ to get the polynomial ring $\rm\ R = \Bbb Z[a_{\,i\,j},b_{\,i\,j}\,].\, $ Now, $ $ in $\rm\,R,\,$ take the determinant of $\rm\ (1+A B)\, A = A\, (1+BA),\ $ then cancel $\rm\ det(A)\ $ (which is valid because $\rm\,det(A)\,$ is a nonzero polynomial and $\,\rm R\,$ is a domain). $\ \ $ Extend to non-square matrices by padding them appropriately with $0$'s and $1$'s to get square matrices. Note that the proof is purely algebraic: it does not require any topological notions (e.g. density).
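In more detail, taking determinants of both sides of the above identity and using multiplicativity yields

$\rm\qquad det(1+AB)\ det(A)\ =\ det\big((1+A B)\, A\big)\ =\ det\big(A\,(1+BA)\big)\ =\ det(A)\ det(1+BA)$

Since $\rm\,det(A)\,$ is a nonzero polynomial in the domain $\rm\,R,\,$ it may be cancelled, so $\rm\ det(1+AB) = det(1+BA)\ $ holds in $\rm\,R,\,$ hence in every commutative ring, by evaluating the indeterminates at arbitrary ring elements.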
Alternatively, one may proceed by way of Schur decomposition, namely
$\rm\left[ \begin{array}{cc} 1 & \rm A \\ \rm B & 1 \end{array} \right]\ =\ \left[ \begin{array}{cc} 1 & \rm 0 \\ \rm B & 1 \end{array} \right]\ \left[ \begin{array}{cc} 1 & \rm 0 \\ \rm 0 & \rm 1-BA \end{array} \right]\ \left[ \begin{array}{cc} 1 & \rm A \\ \rm 0 & 1 \end{array} \right]$

$\rm\phantom{\left[ \begin{array}{cc} 1 & \rm A \\ \rm B & 1 \end{array} \right]}\ =\ \left[ \begin{array}{cc} 1 & \rm A \\ \rm 0 & 1 \end{array} \right]\ \left[ \begin{array}{cc} \rm 1-AB & \rm 0 \\ \rm 0 & \rm 1 \end{array} \right]\ \left[ \begin{array}{cc} 1 & \rm 0 \\ \rm B & 1 \end{array} \right]$
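Taking determinants of the two factorizations (each triangular block factor has determinant $1$) gives

$\rm\qquad det\left[ \begin{array}{cc} 1 & \rm A \\ \rm B & 1 \end{array} \right]\ =\ det(1-BA)\ =\ det(1-AB)$

and replacing $\rm\,B\,$ by $\rm\,-B\,$ yields $\rm\ det(1+AB) = det(1+BA).\ $ Note that this form applies directly to rectangular $\rm\,A,B\,$ of compatible sizes, since the block matrix is square.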
See this answer for more on the universality of polynomial identities and related topics, and see also this sci.math thread from 9 Nov 2007.