Let $A$ be an $n\times n$ matrix and $U$ an invertible $n\times n$ matrix, both with coefficients in $\mathbb R$, and suppose that $ UAU^{-1}=cA $ for some $ c \in \mathbb R,c \neq 0, \pm 1$. How can we prove that $ A^n=0 $?
Proving that $ A^n=0 $ if $UAU^{-1}=cA$?
-
Note that the hypothesis implies $tr(A)=c\cdot tr(A)$, and generally, $tr(A^k)=c^k\cdot tr(A^k)$. Since $c$ is non-zero, and not a root of unity, we must have $tr(A^k)=0$ for all $k$. This implies the characteristic polynomial of $A$ is $x^n$. – 2012-02-26
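Spelling out the computation behind this comment (a sketch; the last step uses Newton's identities, applied over $\mathbb{C}$):
$$\operatorname{tr}(A^k)=\operatorname{tr}\bigl(U A^k U^{-1}\bigr)=\operatorname{tr}\bigl((UAU^{-1})^k\bigr)=\operatorname{tr}\bigl((cA)^k\bigr)=c^k\operatorname{tr}(A^k),$$
so $(1-c^k)\operatorname{tr}(A^k)=0$. Since $c\in\mathbb{R}$ with $c\neq 0,\pm 1$, we have $c^k\neq 1$ for every $k\geq 1$, hence $\operatorname{tr}(A^k)=0$ for all $k\geq 1$. By Newton's identities all coefficients of the characteristic polynomial below the top degree vanish, so it equals $x^n$.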
-
For more interest: (1) $A$ is an operator on an infinite-dimensional Banach space, or (2) $A$ is a matrix, but over scalars other than $\mathbb R$. – 2012-02-26
2 Answers
$UAU^{-1}$ and $A$ have the same eigenvalues. If $\lambda$ is a nonzero eigenvalue of $A$, then $c\lambda$ is an eigenvalue of $cA$, hence an eigenvalue of $A$. Iterating this process, $c^k\lambda$ is an eigenvalue of $A$ for every $k$; but since $c\neq 0, 1, -1$, the values $c^k\lambda$ are pairwise distinct, so $A$ would have infinitely many eigenvalues, a contradiction.
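As a sanity check of the setup (not part of either answer), here is a minimal numpy sketch; the shift matrix `A`, the diagonal `U`, and the values $n=4$, $c=2$ are illustrative choices showing that the hypothesis can be satisfied and that such an $A$ is indeed nilpotent:

```python
import numpy as np

n, c = 4, 2.0

# A: nilpotent "shift" matrix with 1's on the superdiagonal
A = np.diag(np.ones(n - 1), k=1)

# U: diagonal matrix chosen so that U A U^{-1} = c A
U = np.diag(c ** np.arange(n - 1, -1, -1))

# The hypothesis of the question holds for this pair
assert np.allclose(U @ A @ np.linalg.inv(U), c * A)

# All eigenvalues of A are zero, and A^n = 0
assert np.allclose(np.linalg.eigvals(A), 0)
assert np.allclose(np.linalg.matrix_power(A, n), 0)
```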
-
This is exactly the same as the first part of my answer. But you leave out the reason for the $n$th power to be $0$ (Cayley-Hamilton). – 2012-02-25
-
Agreed. I think we posted within a few seconds of each other. I voted yours up. – 2012-02-26
The eigenvalues of $cA$ are $c$ times those of $A$. Since $A$ and $cA$ are conjugate, they have the same eigenvalues. Thus the map $\lambda \mapsto c\lambda$ is a permutation of the eigenvalues of $A$. Applying this permutation repeatedly, it follows that if $A$ has a non-zero eigenvalue then $c^m=1$ for some positive integer $m$, contradicting your choice of $c$ (btw, $c \neq 0$ is obviously not necessary, and I am assuming that $R=\mathbb{R}$ is the real numbers). Hence all eigenvalues of $A$ are zero---i.e. $A$ is nilpotent. Since it is $n$ by $n$, the $n$th power of $A$ must already be zero (Cayley-Hamilton).
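Making the "for some integer $m$" step explicit (a sketch of the same argument): if $\lambda\neq 0$ is an eigenvalue of $A$, then so is $c^k\lambda$ for every $k\geq 0$. Since $A$ has at most $n$ distinct eigenvalues,
$$c^i\lambda=c^j\lambda \ \text{ for some } i>j\geq 0 \;\Longrightarrow\; c^{i-j}=1 \;\Longrightarrow\; c=\pm 1 \ \text{ (as $c\in\mathbb{R}$)},$$
which contradicts the assumption on $c$.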
-
Maybe a trivial question: "Since $A$ and $cA$ are conjugate" — why is that so? – 2012-02-25
-
In general, a matrix $A$ doesn't have (real) eigenvalues. – 2012-02-25
-
The hypothesis is $U A U^{-1}=cA$; this is the definition of conjugacy of matrices. – 2012-02-25
-
Every matrix has an eigenvalue---it may be complex, but that doesn't affect the argument: work over the complex numbers. – 2012-02-25
-
But we can always view $A$ as a matrix over the algebraic closure. – 2012-02-25
-
@Steve thanks. I used to call it a *similar* matrix; now I've seen the definition on Wikipedia. – 2012-02-25