
I am trying to learn some linear algebra, and currently I am having a difficult time grasping some of the concepts. I found this problem and have no idea how to start.

Assume that $\bf A$ is an $n\times n$ complex matrix which has a cyclic vector. Prove that if $\bf B$ is an $n\times n$ complex matrix that commutes with $\bf A$, then ${\bf B}=p({\bf A})$ for some polynomial $p$.

All I know at this point is that ${\bf AB}={\bf BA}$.
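
For concreteness, here is a small numerical sketch (my own addition, not part of the problem): a vector $v$ is cyclic for $A$ when $v, Av, \dots, A^{n-1}v$ span ${\bf C}^n$, which can be tested by checking the rank of the Krylov matrix whose columns are those vectors. The example matrices below are illustrative choices only.

```python
import numpy as np

def has_cyclic_vector(A, v):
    """Check whether v is a cyclic vector for A: the Krylov matrix
    [v, Av, ..., A^{n-1} v] must have full rank n."""
    n = A.shape[0]
    K = np.column_stack([np.linalg.matrix_power(A, r) @ v for r in range(n)])
    return np.linalg.matrix_rank(K) == n

A = np.array([[2., 1.],
              [0., 2.]])                          # a single 2x2 Jordan block
print(has_cyclic_vector(A, np.array([0., 1.])))  # True: e_2 is cyclic for A
print(has_cyclic_vector(A, np.array([1., 0.])))  # False: e_1 is an eigenvector
```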

  • @Beth, I give some links in the other answer, and mention at least one book. Any part of the equivalence is rather long for a website answer, but I give an easy example where an eigenvalue in two separate Jordan blocks causes a specific problem. (2012-08-04)

2 Answers


Let $v$ be a cyclic vector for $A$. Since $A^0v,A^1v,\dots,A^{n-1}v$ are linearly independent, they form a basis for ${\bf C}^n$. Thus $Bv=c_0A^0v+c_1A^1v+\cdots+c_{n-1}A^{n-1}v=p(A)v$ for some constants $c_0,c_1,\dots,c_{n-1}$, where $p(x)=c_0+c_1x+\cdots+c_{n-1}x^{n-1}$. Since $B$ commutes with $A$, it commutes with all powers of $A$, so $B(A^rv)=A^rBv=A^rp(A)v=p(A)(A^rv)$ for $r=0,1,\dots,n-1$ (using $A^rp(A)=p(A)A^r$). But the vectors $A^rv$ form a basis, so $B=p(A)$: if $Bx=Cx$ for every $x$ in a basis, then $Bx=Cx$ for every $x$ in the vector space, hence $B=C$.
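
Here is a small numerical sketch of this argument (the specific $A$, $B$, and $v$ below are my own illustrative choices, not part of the answer): take an $A$ with cyclic vector $v$, a $B$ that commutes with it, recover the coefficients $c_r$ by expressing $Bv$ in the basis $A^rv$, and check that $B=p(A)$.

```python
import numpy as np

# A companion matrix always has e_1 as a cyclic vector.
A = np.array([[0., 0., 0., -1.],
              [1., 0., 0.,  2.],
              [0., 1., 0., -3.],
              [0., 0., 1.,  4.]])
B = 2*np.linalg.matrix_power(A, 2) - 3*A + np.eye(4)  # commutes with A by construction
v = np.array([1., 0., 0., 0.])                        # cyclic vector for A

# Krylov matrix K = [v, Av, A^2 v, A^3 v]; its columns form a basis of C^4.
K = np.column_stack([np.linalg.matrix_power(A, r) @ v for r in range(4)])

# Solve K c = B v for the coefficients of p, then rebuild p(A).
c = np.linalg.solve(K, B @ v)
pA = sum(ci * np.linalg.matrix_power(A, i) for i, ci in enumerate(c))

print(np.allclose(pA, B))  # True: B = p(A)
```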

  • Nice solution... (2012-08-04)

See my answers at Given a matrix, is there always another matrix which commutes with it? and at cyclic vectors - cyclic subspaces.

I will see what else there is here on cyclic vectors. Here is all that is needed; as Gerry points out, if you are a beginner it is hard to understand why you are asking about this topic, but what the hell: http://planetmath.org/?op=getobj&id=5690&from=objects