
I'm clearly lacking insight on the following problem...

So let $C$ be a $2\times2$ matrix with entries in some field. I want to know when I can find $A, B$ (also $2\times2$) such that $C=AB-BA$. In particular, you should be able to find them iff ${\rm Tr}(C)=0$, right? I'm having trouble with the "given ${\rm Tr}(C)=0$, prove that $A, B$ exist" direction.

Methinks the problem is intended to be done with minimal background, i.e. even knowing what a determinant is would be less than kosher (nor should I know that $AB-BA$ is written $[A,B]$).
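(For what it's worth, the "only if" direction is just the fact that $AB-BA$ always has trace zero. Below is a quick symbolic sanity check of that with sympy; the intended pencil-and-paper argument is of course a one-line computation, and the variable names are just ad hoc.)

```python
# Sanity check (sympy): the trace of AB - BA is always zero,
# which is the easy direction of the "iff".
from sympy import symbols, Matrix

a, b, c, d, x, y, z, w = symbols('a b c d x y z w')
A = Matrix([[a, b], [c, d]])
B = Matrix([[x, y], [z, w]])

print((A * B - B * A).trace())  # prints 0
```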

  • 0
    This is proved in more generality at http://math.stackexchange.com/questions/181430. Can this be closed as a duplicate of that, or is that not sufficiently minimal background? (I don't know what you mean by knowing that $AB-BA$ is $[A,B]$ -- isn't that just the definition of $[A,B]$?)2012-08-31
  • 0
    @joriki, basically, I shouldn't know what a Lie bracket is... I also don't understand the answer to the other one... failure abounds! I'm not supposed to know what a subspace is either, I think, so thinking of orthogonal complements to a subspace is also out...2012-08-31
  • 2
    @joriki, it seems to me that the question you reference wags the dog on this; it (sort of) assumes that the sum of commutators is a commutator, which is far from obvious (at least to me).2012-08-31

4 Answers

0

This isn't the answer you are looking for, but it is stronger than the other answers so far, i.e. showing that a trace-zero $n\times n$ matrix can be expressed as a sum of $2n-1$ commutators.

You can show that a trace-zero $n\times n$ matrix $A$ is in fact the sum of 2 commutators. Write $A = D + N$, where $D$ is diagonal with trace zero and $N$ has zeros on its diagonal. You can then represent each of $D$ and $N$ as a commutator. (This takes a little thought, but it's not too hard; I can give details if you need them.)
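For the zero-diagonal piece $N$, one standard trick (sketched and checked below; this assumes the integer differences $i-j$ are invertible in the field, e.g. characteristic $0$, and it need not be the construction the answerer has in mind; the names $P$ and $X$ are ad hoc) is to take $P=\operatorname{diag}(1,2,\dots,n)$ and $X_{ij}=N_{ij}/(i-j)$ for $i\ne j$, so that $[P,X]=N$. The diagonal trace-zero piece $D$ needs a separate argument.

```python
# Sketch/check of one standard trick for the zero-diagonal part N:
# with P = diag(1, ..., n) and X[i, j] = N[i, j] / (i - j) for i != j,
# we get (P*X - X*P)[i, j] = (i - j) * X[i, j] = N[i, j].
# NOTE: this needs the differences i - j to be invertible (e.g. characteristic 0);
# it is an illustrative sketch, not necessarily the construction the answer intends.
from sympy import Matrix, diag, symbols, zeros

n = 3
N = Matrix(n, n, symbols('n0:9'))  # generic 3x3 matrix...
for i in range(n):
    N[i, i] = 0                    # ...with zero diagonal

P = diag(*range(1, n + 1))
X = zeros(n, n)
for i in range(n):
    for j in range(n):
        if i != j:
            X[i, j] = N[i, j] / (i - j)

print(P * X - X * P == N)  # True
```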

2

In addition to the question joriki linked to, there exist several more references on the subject. Notably this paper seems to be quite elementary, with the main focus of the proof being a clever lemma. Another reference is this, but it uses the rational canonical form, which may be too much.

2

Let $$C=\pmatrix{r&s\cr t&-r\cr}$$ Let $$A=\pmatrix{a&b\cr c&d\cr}\qquad B=\pmatrix{x&y\cr z&w\cr}$$ In a deleted answer, NKS did the hard work of calculating $$ AB-BA = \left( \begin{array}{cc} bz - cy & (a-d)y + b(w-x)\\(d-a)z + c(w-x)&cy-bz \end{array}\right) $$ Assuming that calculation is correct (I haven't checked it), let $b=1,z=r,c=y=0$, then all we need is $w-x=s,(d-a)r=t$. This is fine unless $r=0$ and $t\ne0$. If $r=0$, let $b=z=c=y=1$, and $a-d+w-x=s,d-a+w-x=t$.

EDIT: Having checked the calculations, I think it should be $$ AB-BA = \left( \begin{array}{cc} bz - cy & (a-d)y + b(w-x)\\(d-a)z + c(x-w)&cy-bz \end{array}\right) $$ If $r\ne0$ then we can take $b=1,z=r,c=y=0$ and take $a,d,w,x$ to satisfy $w-x=s,d-a=t/r$. If $r=0$, we can take $c=z=t,b=y=-s$ and take $a,d,w,x$ to satisfy $d+x-a-w=1$.
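(For anyone who wants to double-check, here is a quick symbolic verification with sympy of the corrected formula's two cases; the concrete choices of $a,d,w,x$ below are just one way to satisfy the stated constraints, not the only one.)

```python
# Symbolic check of both cases of the construction above
# (variable names mirror the answer's notation; specific choices of
# a, d, w, x are just one way to satisfy the stated constraints).
from sympy import symbols, Matrix, simplify

r, s, t = symbols('r s t')

def comm(A, B):
    return A * B - B * A

# Case r != 0: b = 1, z = r, c = y = 0, with w - x = s and d - a = t/r.
a, x = symbols('a x')
A1 = Matrix([[a, 1], [0, a + t / r]])
B1 = Matrix([[x, 0], [r, x + s]])
print(simplify(comm(A1, B1) - Matrix([[r, s], [t, -r]])))  # zero matrix (r != 0)

# Case r = 0: c = z = t, b = y = -s, with d + x - a - w = 1 (take d = a, x = w + 1).
w = symbols('w')
A2 = Matrix([[a, -s], [t, a]])
B2 = Matrix([[w + 1, -s], [t, w]])
print(comm(A2, B2) - Matrix([[0, s], [t, 0]]))  # zero matrix
```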

1

In your particular case it is quite simple:

1) Prove that $\,T_0:=\{A\in M_2(\Bbb F)\;\;|\;\;\operatorname{tr} A=0\}\,$ is a subspace of dimension 3 in $\,M_2(\Bbb F)\,$ (hint: $\,T_0=\ker \operatorname{tr}\,$, and since the trace is a non-zero linear functional, its kernel is a hyperplane, i.e. a proper subspace of maximal dimension).

2) Show that $\,M':=\operatorname{Span}\,\{\,[A,B]:=AB-BA\;\;|\;\;A,B\in M_2(\Bbb F)\,\}\,$ is a vector subspace of dimension 3 (hint: $$A=\begin{pmatrix}a&b\\c&d\end{pmatrix}\,,\,B=\begin{pmatrix}w&x\\y&z\end{pmatrix}\Longrightarrow AB-BA=\begin{pmatrix}k&**\\*&-k\end{pmatrix}$$ with $\,k\,,\,*\,,\,**\in\Bbb F\,$, so using this show the following is a basis for $\,M'\,$: $$\left\{\begin{pmatrix}0&1\\0&0\end{pmatrix}\,\,,\,\,\begin{pmatrix}0&0\\1&0\end{pmatrix}\,\,,\,\,\begin{pmatrix}1&0\\0&-1\end{pmatrix}\right\}$$).

3) Finally, it's easy to see that $\,M'\leq T_0\,$ , so...
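(A small check, not part of the original answer: each of the three basis matrices above is itself a commutator, which is the kind of verification the comments below ask about; here is a quick symbolic check with sympy. Note that, as the comments also point out, this still only shows a traceless matrix is a *sum* of commutators, not a single commutator.)

```python
# Check that each basis matrix above is itself a commutator.
# As the comments note, this only gives "sum of commutators", not "a commutator".
from sympy import Matrix

def comm(A, B):
    return A * B - B * A

e = Matrix([[0, 1], [0, 0]])
f = Matrix([[0, 0], [1, 0]])
h = Matrix([[1, 0], [0, -1]])
d1 = Matrix([[1, 0], [0, 0]])

print(comm(d1, e) == e)  # True
print(comm(f, d1) == f)  # True
print(comm(e, f) == h)   # True
```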

  • 0
    This doesn't actually answer the question; the question is whether or not every traceless matrix is the commutator of two matrices. The missing ingredient is showing that the sum of commutators is a commutator (since scalar multiplication is trivial), and thus that the span of commutators = set of commutators.2012-08-31
  • 0
    So $M'$ is a subspace of $T_0$ of the same dimension, so the spaces are the same? So I see how your basis works for $T_0$, but I'm not sure why I should believe it works for $M'$? How do I know $*$ and $**$ can be any element of $\Bbb F$?2012-08-31
  • 1
    @AsinglePANCAKE, let me call the basis elements $e,f,h$ respectively (so $h$ is the diagonal one). Then check that $[e,f]=h$, $[h,e]=2e$ and $[h,f]=-2f$, so those elements are indeed in M'. This is the standard basis for traceless 2 by 2 matrices viewed as a Lie algebra, by the way.2012-08-31
  • 0
    Oh, just open up stuff and check. For example, with my choice of matrices $\,A,B\,$, we have that $$*=(a-d)y+(z-w)c\,\,,\,\,**=(a-d)x+(z-w)b$$ and try to form a 2x2 linear system in $\,a-d\,\,,\,\,z-w\,\,$, or the other way around...2012-08-31
  • 0
    @user1306, unless I made some mistake my answer *precisely* answers the OP: it shows the traceless matrices are exactly the commutator ones, and no ingredient is missing: I took *the span* in point (2).2012-08-31
  • 2
    @DonAntonio, that was my point: you showed the subspace GENERATED by matrices which are commutators is equal to the subspace of traceless matrices. In other words, "If the trace is zero, the matrix is a linear combination of commutators of matrices". The OP's question is how to prove the stronger statement "If the trace is 0, show it is the commutator of two matrices".2012-08-31
  • 0
    I see your point now, @user1306... well, going into Lie algebras could maybe do the trick, but I can't see right now how.2012-08-31