4

We have two $2\times 2$ matrices. The first matrix is $A=\begin{bmatrix} a & b \\ c & d \end{bmatrix}$, and the second matrix is obtained from the first: $B=\begin{bmatrix} 0 & 1 \\ -\det(A) & \operatorname{tr}(A) \end{bmatrix}$, or in other words $B=\begin{bmatrix} 0 & 1 \\ bc-ad & a+d \end{bmatrix}$.

The characteristic polynomial of $A$ is $\lambda^2 - \operatorname{tr}(A) \lambda + \det(A)$.
The characteristic polynomial of $B$ is the same, i.e. $\det(A-\lambda E) = \det(B-\lambda E)$.

Since $\det(A-\lambda E) = \det(B-\lambda E)$, the matrices $A$ and $B$ should be similar, provided $A$ is not of the form $\lambda E$. How can one prove that if $\det(A-\lambda E) = \det(B-\lambda E)$ (and $A\neq\lambda E$), then $A$ is similar to $B$?
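
For concreteness, here is a quick symbolic check of the claim that $A$ and $B$ share the same characteristic polynomial (a minimal sketch using sympy; the variable names are my own):

```python
# Check symbolically that det(A - lambda*E) = det(B - lambda*E).
from sympy import symbols, Matrix, eye, simplify

a, b, c, d, lam = symbols('a b c d lambda')

A = Matrix([[a, b], [c, d]])
B = Matrix([[0, 1], [b*c - a*d, a + d]])   # i.e. [[0, 1], [-det(A), tr(A)]]

pA = (A - lam*eye(2)).det()                # characteristic polynomial of A
pB = (B - lam*eye(2)).det()                # characteristic polynomial of B

print(simplify(pA - pB))                   # prints 0: the two polynomials coincide
```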

Thank you.

  • 3
    Take $A=0$, *i.e.*, $A$ is the zero two by two matrix: $$A=\begin{bmatrix}0&0\\ 0&0\end{bmatrix}.$$2011-12-23
  • 0
    @Pierre-YvesGaillard I didn't understand you.2011-12-23
  • 0
    Dear @brainail: then we have $$B=\begin{bmatrix}0&1\\ 0&0\end{bmatrix}.$$ Are $A$ and $B$ similar?2011-12-23
  • 0
    @Pierre-YvesGaillard $A$ isn't equal to $\lambda E$.2011-12-23
  • 0
    @brainail: Sorry, I missed that... Then take a basis of the form $v,Av$.2011-12-23
  • 0
    @Pierre-YvesGaillard Please, could you explain a little more?2011-12-23
  • 1
    @brainail: Over a field, if two square matrices are such that the **four** characteristic and minimal polynomials coincide, then they are similar. In fact they are similar to the companion matrix of the polynomial.2011-12-23
  • 1
    @brainail: Less bad wording: Over a field, a square matrix is similar to the companion matrix of its characteristic polynomial iff the characteristic and minimal polynomials coincide. (See this [MO answer](http://mathoverflow.net/questions/65796/when-animals-attack/81588#81588)).2011-12-23
  • 0
    @Pierre-YvesGaillard Thank you.2011-12-23
  • 0
    @brainail: You're welcome! +1 for your nice question!2011-12-23
  • 0
    @Pierre-YvesGaillard If you want a nice question from me, right here: http://math.stackexchange.com/questions/92962/suppose-a-in-mathbbr-and-exists-n-in-mathbbn-that-an-in-mathb :)) If you come up with a simple proof, it will be great!2011-12-23
  • 0
    @brainail: Thanks, but I'm afraid your question is much too hard for me!2011-12-23

2 Answers

6

Let $A$ be an $n$ by $n$ matrix with entries in a field $K$. Consider the conditions

$(1)$ $A$ is similar to the companion matrix of its characteristic polynomial,

$(2)$ the minimal and characteristic polynomials of $A$ coincide,

$(3)$ there is a vector in $K^n$ whose annihilator in $K[X]$ has degree $n$.

Claim: these conditions are equivalent.

We are asked to prove the equivalence of $(1)$ and $(2)$ for $n=2$.

The following implications are clear for any $n$: $$ (1)\ \iff\ (3)\ \implies\ (2). $$

The proof that $(2)$ implies $(3)$ is simpler when $n=2$. Indeed, assuming that $n=2$ and that $(2)$ holds, we must show that there is a nonzero vector in $K^2$ which is not an eigenvector. Suppose otherwise. Since $(2)$ rules out $A$ being scalar (a scalar matrix has a minimal polynomial of degree $1$), $A$ would have two distinct eigenvalues in $K$, and $K^2$ would be the union of the two corresponding eigenlines, which is impossible.
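
To illustrate the $n=2$ argument concretely, here is a small sympy sketch (the matrix $A$ and the vector $v$ below are my own choices): once a vector $v$ that is not an eigenvector is found, $(v, Av)$ is a basis, and in that basis $A$ becomes the companion matrix of its characteristic polynomial.

```python
# Illustration of (2) => (3) => (1) for n = 2: pick a non-scalar A, take a vector v
# that is not an eigenvector, and express A in the basis (v, A v).
from sympy import Matrix

A = Matrix([[1, 1], [0, 1]])    # non-scalar: char poly = min poly = (x - 1)^2
v = Matrix([0, 1])              # not an eigenvector of A

P = Matrix.hstack(v, A*v)       # change-of-basis matrix with columns v and A v
assert P.det() != 0             # (v, A v) is a basis of K^2, i.e. condition (3)

print(P.inv() * A * P)          # Matrix([[0, -1], [1, 2]]): the companion matrix,
                                # since A(Av) = -det(A) v + tr(A) (A v) by Cayley-Hamilton
```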

Here is a proof that $(2)$ implies $(3)$ when $n\ge3$ (a proof that was not required).

By the Chinese Remainder Theorem, $K^n$ decomposes into $A$-invariant primary components, one for each irreducible factor of the minimal polynomial, and a sum of vectors whose annihilators are the pairwise coprime prime-power factors has the product of these factors as its annihilator. We can therefore assume that the minimal and characteristic polynomials of $A$ are equal to $p^m$, with $p$ irreducible and $m\ge1$. Then any vector of $K^n$ not annihilated by $p^{m-1}$ will have $p^m$ as its annihilator.
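
Here is a small sympy illustration of this last step (the Jordan block below is my own example): for a single block with $p=x-2$ and $m=3$, any vector not killed by $p(A)^{m-1}$ is cyclic.

```python
# A 3x3 Jordan block J has minimal = characteristic polynomial (x - 2)^3.
# A vector not annihilated by (J - 2I)^2 has annihilator (x - 2)^3, hence is cyclic.
from sympy import Matrix, eye

J = Matrix([[2, 1, 0], [0, 2, 1], [0, 0, 2]])
N = J - 2*eye(3)                          # p(J) with p = x - 2

v = Matrix([0, 0, 1])
assert (N**2) * v != Matrix.zeros(3, 1)   # v is not killed by p(J)^(m-1)

P = Matrix.hstack(v, J*v, J*J*v)
print(P.det())                            # nonzero: v, Jv, J^2 v form a basis, giving (3)
```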

For the sake of completeness, here is a statement and a proof of the Chinese Remainder Theorem.

Let $R$ be a commutative ring and $\mathfrak a_1,\dots,\mathfrak a_n$ ideals such that $\mathfrak a_i+\mathfrak a_j=R$ for $i\not=j$. Then the natural morphism from $R$ to the product of the $R/\mathfrak a_i$ is surjective. Moreover the intersection of the $\mathfrak a_i$ coincides with their product.

Proof. We claim $$ R=\mathfrak a_1+\mathfrak a_2\cdots\mathfrak a_n.\tag4 $$ This can be checked either by multiplying together the equalities $R=\mathfrak a_1+\mathfrak a_i$ for $i=2,\dots,n$, or by noting that a prime ideal containing a product of ideals contains one of the factors. Then $(4)$ implies the existence of an $a_1$ in $R$ such that $$ a_1\equiv1\bmod \mathfrak a_1,\quad a_1\equiv0\bmod \mathfrak a_i\ \forall\ i > 1. $$ Similarly we can find elements $a_i$ in $R$ such that $a_i\equiv\delta_{ij}\bmod \mathfrak a_j$ (Kronecker delta). Given residues $r_1,\dots,r_n$ in $R$, the element $\sum_i r_ia_i$ is then congruent to $r_j$ modulo $\mathfrak a_j$ for each $j$, which proves the first claim.

Let $\mathfrak a$ be the intersection of the $\mathfrak a_i$. Multiplying $(4)$ by $\mathfrak a$ we get $$ \mathfrak a= \mathfrak a_1\mathfrak a+ \mathfrak a\mathfrak a_2\cdots\mathfrak a_n\subset \mathfrak a_1\ (\mathfrak a_2\cap\cdots\cap\mathfrak a_n)\subset\mathfrak a. $$ This gives the second claim, directly for $n=2$, by induction for $n > 2$. QED
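
As a concrete instance of the construction in the proof, here is a sympy sketch for two comaximal ideals of $K[x]$ (the polynomials $f$ and $g$ are my own choice): a Bézout relation $sf+tg=1$ produces the element $a_1=tg$ with $a_1\equiv1\bmod(f)$ and $a_1\equiv0\bmod(g)$.

```python
# CRT element for R = Q[x] and the comaximal ideals a_1 = (f), a_2 = (g):
# from s*f + t*g = 1 we get a1 = t*g with a1 = 1 mod (f) and a1 = 0 mod (g).
from sympy import symbols, gcdex, rem

x = symbols('x')
f = (x - 1)**2               # a_1 = (f)
g = x**2 + 1                 # a_2 = (g), coprime to f over Q

s, t, h = gcdex(f, g)        # s*f + t*g = h = gcd(f, g) = 1
a1 = (t*g).expand()

print(rem(a1, f, x))         # 1: a1 is congruent to 1 modulo a_1
print(rem(a1, g, x))         # 0: a1 is congruent to 0 modulo a_2
```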

ASIDE. It is tempting to generalize results about endomorphisms of finite dimensional vector spaces to results about finitely generated torsion modules over Dedekind domains. Then the first question is: how do you generalize the notion of characteristic polynomial? The answer is obvious:

Let $D$ be a Dedekind domain, $G$ the Grothendieck group of the category $C$ of finitely generated torsion $D$-modules, and $H$ the group of fractional ideals of $D$. Then the map $D/\mathfrak a\mapsto\mathfrak a$ induces an isomorphism $\chi$ of $G$ onto $H$, mapping the elements of $G$ represented by objects of $C$ onto the integral ideals of $D$, prompting us to call $\chi([M])$ the characteristic ideal of $M$.

In particular, the equivalence between $(1)$, $(2)$ and $(3)$ remains true in this wider context.

EDIT. In the above Aside, the assumption that $D$ is a principal ideal domain has been relaxed to the assumption that $D$ is only a Dedekind domain.

  • 1
    This answer is overkill.2011-12-23
  • 0
    Dear @Norbert: Thanks for your comment. I edited the answer.2011-12-25
6

For a matrix $A$ with minimal polynomial $x^2-(\operatorname{tr}A)x+\det A$, $A$ is similar to its "companion matrix" $$ \left( \begin{array}{cc} 0&-\det A\\ 1&\operatorname{tr}A\\ \end{array} \right). $$ However, if $A$ is a constant times the identity, $A=\lambda I$, then $A$ cannot be similar to its companion matrix (for instance, because they do not have the same minimal polynomial).
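
A symbolic sanity check of this claim (a sketch in sympy; it assumes $c\neq0$, so that $e_1$ is not an eigenvector and $(e_1, Ae_1)$ is a basis):

```python
# If c != 0, then P = [e1 | A e1] is invertible and conjugates A into the
# companion matrix [[0, -det A], [1, tr A]] from the answer above.
from sympy import symbols, Matrix, simplify

a, b, c, d = symbols('a b c d')
A = Matrix([[a, b], [c, d]])

e1 = Matrix([1, 0])
P = Matrix.hstack(e1, A*e1)                   # P = [[1, a], [0, c]], invertible iff c != 0

C = Matrix([[0, -A.det()], [1, A.trace()]])   # companion matrix of x^2 - tr(A) x + det(A)
print((P.inv()*A*P - C).applyfunc(simplify))  # zero matrix: A = P C P^{-1} whenever c != 0
```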

  • 2
    Note that $$\mathbf A=\begin{pmatrix}a&b\\c&d\end{pmatrix}=\begin{pmatrix}-\det\mathbf A&a\\0&c\end{pmatrix}\cdot\begin{pmatrix}0&1\\-\det\mathbf A&\mathrm{tr}\mathbf A\end{pmatrix}\cdot \begin{pmatrix}-\det\mathbf A&a\\0&c\end{pmatrix}^{-1}$$2011-12-23
  • 2
    Additionally, $$\mathbf A=\begin{pmatrix}a&b\\c&d\end{pmatrix}=\begin{pmatrix} 1&a\\0&c\end{pmatrix}\begin{pmatrix}0&-\det\mathbf A\\1&\mathrm{tr}\mathbf A\end{pmatrix}\begin{pmatrix} 1&a\\0&c\end{pmatrix}^{-1}$$2011-12-23
  • 0
    @J.M. What if $A = \lambda I$?2011-12-23
  • 0
    @brainail: that's a question that you should try answering yourself. :) In particular, would $\begin{pmatrix}-\det\mathbf A&a\\0&c\end{pmatrix}$ be invertible in that case?2011-12-23
  • 0
    @J.M. yep, you are right. thank you :)2011-12-23