
Let $A \in \mathbb{C}^{n,n}$ and let $\{ \lambda_1, \dots , \lambda_r\}$ be the distinct eigenvalues of $A$.

My lecture notes say that the minimal polynomial of $A$ is $\prod_{i=1}^r(x-\lambda_i)^{a_i}$, where $a_i$ is the largest of the sizes of the Jordan blocks of $A$ with eigenvalue $\lambda_i$.

I can't get my head around why this is the case. Can someone explain it to me?


1 Answer


The rough outline of the proof is:

  1. If $A$ and $T$ are square matrices, and $T$ is invertible, then the minimal polynomial for $A$ is the same as the minimal polynomial for $TAT^{-1}$.

  2. If $B$ is a block diagonal matrix, with square matrices $B_1,\dots,B_k$ along the diagonal and zeros elsewhere, then the minimal polynomial for $B$ is the least common multiple of the minimal polynomials for the $B_i$.

  3. If $J_m(\lambda)$ is an $m\times m$ Jordan block - $\lambda$ on the diagonal and $1$ just above the diagonal - then the minimal polynomial for $J_m(\lambda)$ is $p(x)=(x-\lambda)^m$.

So, for any $A$, you can find an invertible $T$ so that $B=TAT^{-1}$ is in Jordan Normal Form; by (1), $A$ and $B$ have the same minimal polynomial. By (2), the minimal polynomial for $B$ is the least common multiple of the minimal polynomials of its Jordan blocks, and (3) tells us what those are, which yields the result.
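As a concrete illustration (an example of my own, not taken from the question): if the Jordan form of $A$ consists of the blocks $J_2(\lambda_1)$, $J_1(\lambda_1)$ and $J_3(\lambda_2)$ with $\lambda_1\neq\lambda_2$, then the blocks have minimal polynomials $(x-\lambda_1)^2$, $x-\lambda_1$ and $(x-\lambda_2)^3$, so

$$\operatorname{lcm}\big((x-\lambda_1)^2,\; x-\lambda_1,\; (x-\lambda_2)^3\big)=(x-\lambda_1)^2(x-\lambda_2)^3,$$

which is exactly $\prod_{i=1}^r(x-\lambda_i)^{a_i}$ with $a_1=2$ and $a_2=3$ the largest block sizes for $\lambda_1$ and $\lambda_2$.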

(1) is easy to prove: for any polynomial $p$, $p(TAT^{-1})=T\,p(A)\,T^{-1}$, so $p(A)=0$ if and only if $p(TAT^{-1})=0$, and the two matrices are annihilated by exactly the same polynomials.

(2) is easy once you realize that if $B$ is of this form and $p$ is any polynomial, then $p(B)$ is composed of $p(B_1),\dots,p(B_k)$ along the diagonal and zero elsewhere, so $p(B)=0$ exactly when $p(B_i)=0$ for every $i$.
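In the two-block case, for instance, this reads

$$p\!\begin{pmatrix} B_1 & 0 \\ 0 & B_2 \end{pmatrix} = \begin{pmatrix} p(B_1) & 0 \\ 0 & p(B_2) \end{pmatrix},$$

so the monic polynomials annihilating $B$ are exactly the common multiples of the minimal polynomials of $B_1$ and $B_2$, and the monic one of least degree is their least common multiple.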

(3) takes a little bit of arithmetic. First prove it for $\lambda=0$, then reduce the general case to it by writing $J_m(\lambda)=\lambda I + J_m(0)$, so that $(J_m(\lambda)-\lambda I)^k = J_m(0)^k$.
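For the $\lambda=0$ case, a quick way to see it (a small worked computation, not spelled out above): $J_m(0)$ sends the standard basis vectors $e_1\mapsto 0$ and $e_i\mapsto e_{i-1}$ for $i\ge 2$, so $J_m(0)^k$ sends $e_i\mapsto e_{i-k}$ (and to $0$ when $i\le k$); its nonzero entries sit on the $k$-th superdiagonal. Hence

$$J_m(0)^{m-1}\neq 0 \quad\text{while}\quad J_m(0)^m = 0,$$

so the minimal polynomial of $J_m(0)$ is $x^m$, and for $J_m(\lambda)$ it is $(x-\lambda)^m$.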