
I am having a hard time trying to prove the following:

Let $M$ be the space of all $2\times2$ complex matrices $X$ that are skew-hermitian (i.e. $\bar{X}^t = -X$). We consider $M$ as a vector space over $\Bbb R$.

For any $A\in M$, define the operator $\text{ad}_A : M\to M$ by $\text{ad}_A(X) = AX - XA$.

Show that $\text{ad}_A$ is diagonalizable.

Let $A=\left[\begin{array}{cc}i & 1\\-1 & i\end{array}\right].$ How should I compute the eigenvalues of the operator $\text{ad}_A$?

(Maybe we should show there is a basis of $M$ consisting of eigenvectors of $\text{ad}_A$ ?)

Thanks.


2 Answers


There are a few facts to use here.

Fact 1: Entrywise complex conjugation of matrices respects addition and multiplication of matrices. That is, $\overline{A+B}=\overline{A}+\overline{B}$ and $\overline{C\cdot D}=\overline{C}\cdot\overline{D}$ for any compatibly-dimensioned matrices $A,B,C,D$.

Fact 2: $(A+B)^t=A^t+B^t$ and $(CD)^t=D^tC^t$ for any compatibly-dimensioned matrices $A,B,C,D$.

Fact 3: Skew-hermitian matrices are diagonalizable.

Given that $A,X$ are skew-hermitian $2\times 2$ matrices, how can we use the first two facts to rewrite $\overline{\text{ad}_A(X)}\,{}^t$? That should let us use the third fact to get the desired conclusion.
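
For instance, applying Fact 1 and then Fact 2, together with $\bar{A}^t=-A$ and $\bar{X}^t=-X$:
$$\overline{\text{ad}_A(X)}\,^t=\overline{AX-XA}\,^t=\left(\overline{A}\,\overline{X}-\overline{X}\,\overline{A}\right)^t=\overline{X}^t\,\overline{A}^t-\overline{A}^t\,\overline{X}^t=(-X)(-A)-(-A)(-X)=-\text{ad}_A(X),$$
so $\text{ad}_A(X)$ is again skew-hermitian.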


As for your particular example, remember that your eigenvectors will be matrices. You could look for $4$ linearly independent skew-hermitian eigenvectors of $\text{ad}_A$, but that might be kind of obnoxious, and we're really only interested in the eigenvalues, anyway.

I recommend you start with a general matrix $X=\left[\begin{array}{cc}a+bi&c+di\\u+vi&x+yi\end{array}\right].$ For $X$ to be skew-hermitian, it is necessary and sufficient that $a=x=0,$ $v=d$, and $u=-c.$ (Why?) Thus, a general skew-hermitian matrix $X$ has the form $X=\left[\begin{array}{cc}bi&c+di\\-c+di&yi\end{array}\right].$ Since there are $4$ free real parameters, $M$ has dimension at most $4$ as a vector space over $\Bbb R$. In particular, letting $V_1=\left[\begin{array}{cc}i&0\\0&0\end{array}\right],\quad V_2=\left[\begin{array}{cc}0&1\\-1&0\end{array}\right],\quad V_3=\left[\begin{array}{cc}0&i\\i&0\end{array}\right],\quad V_4=\left[\begin{array}{cc}0&0\\0&i\end{array}\right],$ we find that $\{V_1,V_2,V_3,V_4\}$ is a basis of $M$.

A general $2\times2$ skew-hermitian matrix can then be written uniquely as $X=bV_1+cV_2+dV_3+yV_4$, and a direct calculation shows that, with $X$ as above, $\text{ad}_A(X)=2dV_1+(y-b)V_3-2dV_4.$ Consider then the equivalent linear transformation $T:\Bbb R^4\to\Bbb R^4$ given by $T(b,c,d,y)=(2d,0,y-b,-2d)$. It suffices to find the eigenvalues of $T$.
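
If you want to double-check the expansion of $\text{ad}_A(X)$ in this basis and the eigenvalues of $T$, here is a short SymPy sketch (just a verification of the computation above; the coordinates are ordered $(b,c,d,y)$):

```python
# Verify ad_A(X) = 2d*V_1 + (y - b)*V_3 - 2d*V_4 and the eigenvalues of T.
import sympy as sp

b, c, d, y = sp.symbols('b c d y', real=True)
A = sp.Matrix([[sp.I, 1], [-1, sp.I]])
X = sp.Matrix([[b*sp.I, c + d*sp.I], [-c + d*sp.I, y*sp.I]])  # general skew-hermitian X

# The result is [[2di, (y-b)i], [(y-b)i, -2di]], i.e. 2d*V_1 + (y - b)*V_3 - 2d*V_4.
print((A*X - X*A).expand())

# Matrix of T(b, c, d, y) = (2d, 0, y - b, -2d) in the coordinates (b, c, d, y).
T = sp.Matrix([
    [ 0, 0,  2, 0],
    [ 0, 0,  0, 0],
    [-1, 0,  0, 1],
    [ 0, 0, -2, 0],
])
print(T.eigenvals())  # eigenvalues: 0 (multiplicity 2), 2*I, -2*I
```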


The eigenvalues of your matrix $A$ are $0$ and $2i$. You can then compute that the eigenvalues of $\text{ad}_A$ are $0$ (with multiplicity $2$), $2i$, and $-2i$. The easiest way to see this is to use an orthonormal basis in which $A$ becomes the diagonal matrix with diagonal entries $0$ and $2i$; the computations are then quite simple, so simple that it is not much effort to get the following general result: if $A$ is a skew-hermitian $n\times n$ matrix with eigenvalues $i\lambda_1,\dots,i\lambda_n$, then the eigenvalues of $\text{ad}_A$ as defined in the question are the numbers of the form $i(\lambda_j-\lambda_k)$ for all pairs $j$ and $k$ of indices (not necessarily distinct; we are working on a real vector space of dimension $n^2$).
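
For the particular $A$ here, this is easy to confirm with a few lines of SymPy (a verification sketch only; I use the basis $E_{11},E_{12},E_{21},E_{22}$ of all $2\times2$ complex matrices, which is the complexification of $M$, so the eigenvalues agree):

```python
# Check: eigenvalues of A, then eigenvalues of ad_A, match i*(lambda_j - lambda_k).
import sympy as sp

A = sp.Matrix([[sp.I, 1], [-1, sp.I]])
print(A.eigenvals())  # eigenvalues of A: 0 and 2*I

# Matrix of ad_A(X) = A*X - X*A in the basis E11, E12, E21, E22.
cols = []
for p, q in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    E = sp.zeros(2, 2)
    E[p, q] = 1
    Y = A*E - E*A                         # ad_A applied to a basis matrix
    cols.append(sp.Matrix([Y[0, 0], Y[0, 1], Y[1, 0], Y[1, 1]]))
adA = sp.Matrix.hstack(*cols)             # columns = coordinates of ad_A(E_pq)
print(adA.eigenvals())  # 0 (multiplicity 2), 2*I, -2*I -- the differences of 0 and 2*I
```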