
$A$ is a nonzero $n \times n$ matrix such that $A^2=0$. If $n=2$, can I show that there exists an invertible $2 \times 2$ matrix $S$ such that $S^{-1}AS=\begin{bmatrix} 0 & 0\\ 1 & 0 \end{bmatrix}$?

My first approach was to rearrange the equation as $A=S\begin{bmatrix} 0 & 0\\ 1 & 0 \end{bmatrix}S^{-1}$, but this seems strange, because it would mean $\begin{bmatrix} 0 & 0\\ 1 & 0 \end{bmatrix}$ is the eigenvalue matrix, and it isn't diagonal.

I had no more ideas, so I looked at the hint: let $\vec{u} \in \mathbb{R}^n$ be such that $A\vec{u}\neq\vec 0$, and set $S=\begin{bmatrix} \vec{u} & A\vec{u} \end{bmatrix}$ to continue from there. But I don't understand how I can just throw an arbitrary vector from $\mathbb{R}^n$ into the eigenspace matrix $S$ without even confirming that $\vec{u}$ is an eigenvector of the matrix $A$.
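To see the hint in action, here is a small numerical sketch (the particular matrix $A$ and vector $\vec u$ are my own example choices, not part of the original problem): form $S=[\vec u \mid A\vec u]$ and check that $S^{-1}AS$ is the required matrix.

```python
# A concrete check of the hint (A and u below are my own example
# choices): with A^2 = 0 and S = [u | Au], we should get
# S^{-1} A S equal to [[0, 0], [1, 0]].

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def inv2(M):
    """Invert a 2x2 matrix via the adjugate formula."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [[ M[1][1] / det, -M[0][1] / det],
            [-M[1][0] / det,  M[0][0] / det]]

A = [[2, -1], [4, -2]]                      # nonzero, and A^2 = 0
assert matmul(A, A) == [[0, 0], [0, 0]]

u = [[1], [0]]                              # any u with A u != 0 works
Au = matmul(A, u)                           # here [[2], [4]]
S = [[u[0][0], Au[0][0]],
     [u[1][0], Au[1][0]]]                   # columns u and A u

result = matmul(inv2(S), matmul(A, S))
print(result)                               # [[0.0, 0.0], [1.0, 0.0]]
```

The point is that $\vec u$ does not need to be an eigenvector: $S$ is simply a change-of-basis matrix, and in the basis $\{\vec u, A\vec u\}$ the map sends the first basis vector to the second and the second to $\vec 0$, which is exactly what $\begin{bmatrix} 0 & 0\\ 1 & 0 \end{bmatrix}$ encodes.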

  • It didn't say that $\vec{u}$ is an eigenvector of $A$, but since $S$ is the eigenspace of $A$, by throwing another vector into $S$, aren't we assuming that the vector is also an eigenvector? Otherwise, what is the meaning of putting the extra $\vec{u}$ into the eigenspace matrix $S$? Wouldn't it alter the whole equation? (2011-11-05)

2 Answers


This is basically the same answer as given by Listing, but formulated in "different language" - you can choose what suits you best.

Choose any non-zero vector $\vec x$ such that $\vec x A=\vec y$ is non-zero. (Note that I am using row vectors.)

Note that $\vec y A = \vec x A^2=0$. Also, $\vec x$ and $\vec y$ are linearly independent: $\vec y \neq \vec 0$, and if $\vec x$ were a multiple of $\vec y$, then $\vec x A$ would be $\vec 0$ as well.

Now we have $\begin{pmatrix} \vec y \\ \vec x \end{pmatrix} A = \begin{pmatrix} \vec 0 \\ \vec y\end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} \vec y \\ \vec x\end{pmatrix}.$ If we denote $P=\begin{pmatrix} \vec y \\ \vec x\end{pmatrix}$, then the matrix $P$ is regular (its rows are linearly independent) and $PA=\begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}P$, i.e. $PAP^{-1}=\begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}$.

(Basically, we just asked what the matrix of the linear map $A$ is in the basis consisting of $\vec y$ and $\vec x$.)
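A quick numerical check of this row-vector construction (the specific $A$ and $\vec x$ below are my own illustration, not from the answer):

```python
# Sketch of the row-vector argument above (A and x are my example
# choices): set y = xA, stack P = [y; x], and P A P^{-1} should
# equal [[0, 0], [1, 0]].

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def inv2(M):
    """Invert a 2x2 matrix via the adjugate formula."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [[ M[1][1] / det, -M[0][1] / det],
            [-M[1][0] / det,  M[0][0] / det]]

A = [[2, -1], [4, -2]]                 # A^2 = 0
x = [[1, 0]]                           # row vector with xA != 0
y = matmul(x, A)                       # y = xA, here [[2, -1]]
assert matmul(y, A) == [[0, 0]]        # yA = xA^2 = 0

P = [y[0], x[0]]                       # stack the rows y and x
result = matmul(matmul(P, A), inv2(P))
print(result)                          # [[0.0, 0.0], [1.0, 0.0]]
```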

  • Where does it say that $P$ is an eigenspace matrix? Since $\vec x$ is not an eigenvector, it clearly isn't one. (2011-11-05)

Let $A$ be such that $A^2=0$. Then $A$ can only have the eigenvalue $0$, and since $A \neq 0$, its Jordan normal form must be the nonzero nilpotent block: there exists an invertible $H$ such that $A=H^{-1}\begin{bmatrix} 0 & 1\\ 0 & 0 \end{bmatrix}H$.

As $X=\begin{bmatrix} 0 & 0\\ 1 & 0 \end{bmatrix}$ also has only the eigenvalue $0$ (and is nonzero), there exists an invertible $P$ such that

$X=P^{-1}\begin{bmatrix} 0 & 1\\ 0 & 0 \end{bmatrix}P$

Now, by transitivity of similarity, your claim is immediate, because $A$ is similar to $X$.
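To make the similarity in the last step explicit: one concrete choice of $P$ (my own; the answer only asserts that one exists) is the basis swap $P=\begin{bmatrix} 0 & 1\\ 1 & 0 \end{bmatrix}$, which conjugates the Jordan block into $X$:

```python
# Checking X = P^{-1} J P, where J is the 2x2 nilpotent Jordan block
# and P (my explicit choice) swaps the two basis vectors.

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

J = [[0, 1], [0, 0]]                   # Jordan block for eigenvalue 0
P = [[0, 1], [1, 0]]                   # basis swap; note P^{-1} = P
assert matmul(P, P) == [[1, 0], [0, 1]]

X = matmul(P, matmul(J, P))            # P^{-1} J P, using P^{-1} = P
print(X)                               # [[0, 0], [1, 0]]
```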

  • Thank you, I changed it to use the Jordan decomposition as you suggested. (2011-11-05)