56

First of all, am I being crazy in thinking that if $\lambda$ is an eigenvalue of $AB$, where $A$ and $B$ are both $N \times N$ matrices (not necessarily invertible), then $\lambda$ is also an eigenvalue of $BA$?

If it's not true, then under what conditions is it true or not true?

If it is true, can anyone point me to a citation? I couldn't find it in a quick perusal of Horn & Johnson. I have seen a couple of proofs that the characteristic polynomial of $AB$ is equal to the characteristic polynomial of $BA$, but none with any citations.

A trivial proof would be OK, but a citation is better.

  • 0
    Have you tried to find a counterexample using basic $2\times2$ matrices? (2012-03-27)

4 Answers

96

If $v$ is an eigenvector of $AB$ with eigenvalue $\lambda\ne0$, then $ABv=\lambda v\ne0$, so $Bv\ne0$, and $\lambda Bv=B(ABv)=(BA)Bv$; hence $Bv$ is an eigenvector of $BA$ with the same eigenvalue. If $0$ is an eigenvalue of $AB$, then $0=\det(AB)=\det(A)\det(B)=\det(BA)$, so $0$ is also an eigenvalue of $BA$.
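The argument above can be spot-checked numerically. The $2\times2$ matrices below are my own choice, picked so that $AB$ has the exact integer eigenpair $\lambda=2$, $v=(0,1)$:

```python
# Sanity check: if AB v = lam*v with lam != 0, then Bv is an
# eigenvector of BA for the same eigenvalue lam.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matvec(X, v):
    return [sum(X[i][k] * v[k] for k in range(2)) for i in range(2)]

A = [[3, -3], [1, 1]]
B = [[1, 1], [0, 1]]

AB = matmul(A, B)          # [[3, 0], [1, 2]]
BA = matmul(B, A)          # [[4, -2], [1, 1]]

v, lam = [0, 1], 2         # AB v = 2 v
assert matvec(AB, v) == [lam * x for x in v]

Bv = matvec(B, v)          # [1, 1], nonzero because lam != 0
assert matvec(BA, Bv) == [lam * x for x in Bv]

# The full 2x2 spectra also agree: the characteristic polynomial is
# t^2 - tr(M) t + det(M), and trace and determinant of AB and BA coincide.
tr = lambda M: M[0][0] + M[1][1]
det = lambda M: M[0][0] * M[1][1] - M[0][1] * M[1][0]
assert (tr(AB), det(AB)) == (tr(BA), det(BA))
```

Here $Bv=(1,1)$ and $BA\,(1,1)=(2,2)$, exactly as the eigenvector argument predicts.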

More generally, Jacobson's lemma in operator theory states that for any two bounded operators $A$ and $B$ acting on a Hilbert space $H$ (or more generally, for any two elements of a Banach algebra), the non-zero points of the spectrum of $AB$ coincide with those of the spectrum of $BA$.

  • 2
    @crf No -- the trouble is that when $Bv=0$, maybe $v$ is not in the range of $A$. The argument given for $\lambda\ne0$ works in Hilbert space, for example, but for $\lambda=0$ the result is not generally true there: On the infinite sequence space $\ell^2$, let $A$ be the right shift ($A(x_1,x_2,\ldots)=(0,x_1,x_2,\ldots)$) and $B$ the left shift. Then $AB(1,0,\ldots)=0$, but $BA$ is the identity. (2013-05-07)
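It may be worth noting (my own illustration, not from the comment above) that truncating these shifts to $n\times n$ matrices restores the finite-dimensional result: both truncated products become singular, so $\lambda=0$ is again a common eigenvalue, and the counterexample genuinely requires infinite dimensions.

```python
# Truncate the right shift A and left shift B to n x n matrices.
# In finite dimensions both AB and BA pick up eigenvalue 0.
n = 4
A = [[1 if i == j + 1 else 0 for j in range(n)] for i in range(n)]  # right shift
B = [[1 if j == i + 1 else 0 for j in range(n)] for i in range(n)]  # left shift

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

AB, BA = matmul(A, B), matmul(B, A)
diag = lambda M: [M[i][i] for i in range(n)]

# AB kills the first coordinate, BA the last; both are diagonal here,
# and both have 0 in their spectrum -- unlike BA = I on l^2.
assert diag(AB) == [0, 1, 1, 1]
assert diag(BA) == [1, 1, 1, 0]
```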
18

It is true that the eigenvalues (counting multiplicity) of $AB$ are the same as those of $BA$.

This is a corollary of Theorem 1.3.22 in the second edition of "Matrix Analysis" by Horn and Johnson, which is Theorem 1.3.20 in the first edition.

Paraphrasing from the cited Theorem: If $A$ is an $m$ by $n$ matrix and $B$ is an $n$ by $m$ matrix with $n \geq m$ then the characteristic polynomial $p_{BA}$ of $BA$ is related to the characteristic polynomial $p_{AB}$ of $AB$ by $p_{BA}(t) = t^{n-m} p_{AB}(t).$

In your case, $n = m$, so $p_{BA} = p_{AB}$ and it follows that the eigenvalues (counting multiplicity) of $AB$ and $BA$ are the same.

You can see Horn and Johnson's proof in the Google Books link above. A similar proof was given in this answer from Maisam Hedyelloo.
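A quick sanity check of the cited identity $p_{BA}(t) = t^{n-m}\,p_{AB}(t)$ on the smallest interesting rectangular case, $m=1$, $n=2$ (example matrices are my own):

```python
# A is 1x2, B is 2x1, so AB is a 1x1 matrix and BA is 2x2.
A = [[1, 2]]
B = [[3], [4]]

# AB is the scalar 11, so p_AB(t) = t - 11.
AB = A[0][0] * B[0][0] + A[0][1] * B[1][0]

# BA = [[3, 6], [4, 8]].
BA = [[B[i][0] * A[0][j] for j in range(2)] for i in range(2)]

tr_BA = BA[0][0] + BA[1][1]                         # 11
det_BA = BA[0][0] * BA[1][1] - BA[0][1] * BA[1][0]  # 0

# p_BA(t) = t^2 - tr t + det = t^2 - 11 t = t * (t - 11) = t^(2-1) p_AB(t),
# matching the theorem with n - m = 1.
assert tr_BA == AB and det_BA == 0
```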

16

Here is an alternative proof for this result, following Exercises 6.2.8-9 of Hoffman & Kunze's Linear Algebra (p. 190):


Lemma: Let $A,B\in M_n(\mathbb{F})$, where $\mathbb{F}$ is an arbitrary field. If $I-AB$ is invertible, then so is $I-BA$, and

$(I-BA)^{-1}=I+B(I-AB)^{-1}A.$

Proof of Lemma: Since $I-AB$ is invertible,

\begin{align} &I=(I-AB)(I-AB)^{-1}=(I-AB)^{-1}-AB(I-AB)^{-1}\\ &\implies (I-AB)^{-1} = I+ AB(I-AB)^{-1}. \end{align}

Then we have

\begin{align} I+B(I-AB)^{-1}A&= I+B[I+ AB(I-AB)^{-1}]A= I+BA+BAB(I-AB)^{-1}A\\ \implies I&=I+B(I-AB)^{-1}A-BA-BAB(I-AB)^{-1}A\\ &=I[I+B(I-AB)^{-1}A]-BA[I+B(I-AB)^{-1}A]\\ &=(I-BA)[I+B(I-AB)^{-1}A]. \qquad\checkmark \end{align}
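The lemma's inversion formula can be verified on concrete matrices. The $2\times2$ integer example below is my own; it has $\det(I-AB)\neq0$, so the lemma applies:

```python
# Spot-check of the Lemma: if I - AB is invertible, then
# (I - BA)^{-1} = I + B (I - AB)^{-1} A.
from fractions import Fraction

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    # inverse of a 2x2 matrix via the adjugate formula
    d = Fraction(M[0][0] * M[1][1] - M[0][1] * M[1][0])
    return [[ M[1][1] / d, -M[0][1] / d],
            [-M[1][0] / d,  M[0][0] / d]]

def sub(X, Y):
    return [[X[i][j] - Y[i][j] for j in range(2)] for i in range(2)]

def add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

I = [[1, 0], [0, 1]]
A = [[1, 1], [0, 1]]
B = [[1, 0], [1, 1]]

AB, BA = matmul(A, B), matmul(B, A)
M = inv2(sub(I, AB))                          # (I - AB)^{-1}
candidate = add(I, matmul(matmul(B, M), A))   # I + B (I - AB)^{-1} A

# the lemma says this candidate inverts I - BA
assert matmul(sub(I, BA), candidate) == I
```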


Proposition: $\forall A,B\in M_n(\mathbb{F}):$ $AB$ and $BA$ have the same eigenvalues.

Proof: Let $\alpha\in\mathbb{F}$ be an eigenvalue of $AB$. If $\alpha=0$, then $0=\det(0I-AB)=\det(-A)\det(B)=\det(B)\det(-A)=\det(0I-BA)$ and so $0$ is an eigenvalue of $BA$ also.

Otherwise $\alpha\neq0$. Suppose $\alpha$ is not an eigenvalue of $BA$. Then $0\neq\det(\alpha I-BA)=\alpha^n\det(I-(\frac{1}{\alpha}B)A)$. Then $0\neq\det(I-(\frac{1}{\alpha}B)A),$ so that $I-(\frac{1}{\alpha}B)A$ is invertible. By the lemma above we know that $I-A(\frac{1}{\alpha}B)$ is invertible as well, meaning $0\neq\det(I-A(\frac{1}{\alpha}B))=\det(I-\frac{1}{\alpha}AB) \implies 0\neq\det(\alpha I-AB)$. But we assumed $\alpha$ to be an eigenvalue for $AB$, $\unicode{x21af}$.

  • 0
    The equation doesn't end there, and determinant is multilinear on columns. (2017-10-12)
7

Notice that $\lambda$ being an eigenvalue of $AB$ implies that $\det(AB-\lambda I)=0$, so (assuming $A$ and $B$ are invertible) $$0=\det(A^{-1})\det(AB-\lambda I)\det(B^{-1})=\det(A^{-1}(AB-\lambda I)B^{-1})=\det((B-\lambda A^{-1})B^{-1})=\det(I-\lambda A^{-1}B^{-1}).$$ This further implies that $$\det(BA-\lambda I)=\det(BA(I-\lambda A^{-1}B^{-1}))=\det(BA)\det(I-\lambda A^{-1}B^{-1})=0,$$ i.e., $\lambda$ is an eigenvalue of $BA$. This proof holds only for invertible matrices $A$ and $B$, though. For singular matrices you can show that $0$ is a common eigenvalue, but I can't think of a way to show that the rest of the eigenvalues are equal.
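For what it's worth, a numerical check (example matrices are my own) suggests the eigenvalues still agree in the singular case, consistent with the Horn & Johnson result cited above, though of course a single example proves nothing:

```python
# Even with A and B both singular (rank 1 here), the 2x2 characteristic
# polynomials t^2 - tr(M) t + det(M) of AB and BA still coincide.
A = [[1, 1], [1, 1]]
B = [[1, 2], [2, 4]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

AB, BA = matmul(A, B), matmul(B, A)   # [[3, 6], [3, 6]] and [[3, 3], [6, 6]]
tr = lambda M: M[0][0] + M[1][1]
det = lambda M: M[0][0] * M[1][1] - M[0][1] * M[1][0]

# both characteristic polynomials are t^2 - 9t, eigenvalues {0, 9}
assert (tr(AB), det(AB)) == (tr(BA), det(BA)) == (9, 0)
```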