
There is a lemma in the second edition of Artin's Algebra (Chapter 10 on Representation Theory) used to prove the orthogonality relations for irreducible characters:

$\textbf{Lemma :}$ Let $T$ be an operator on the space $\mathbb{C}^{m \times n}$ of $m \times n$ complex matrices, defined as

$T(M) = AMB$, where $A$ is some $m \times m$ complex matrix and $B$ some $n \times n$ complex matrix.

Then $\operatorname{trace} T = \operatorname{trace}(A)\operatorname{trace}(B)$.

One can construct an "eigenmatrix" (is this terminology standard?) of $T$ from an eigenvector $x$ of $A$ and an eigenvector $y$ of $B^T$, namely the matrix $xy^T$. Indeed, if $x$ has eigenvalue $\lambda$ and $y$ has eigenvalue $k$, then $T(xy^T) = Axy^TB = (Ax)(B^Ty)^T = \lambda k \, xy^T$, so the eigenmatrix $xy^T$ has eigenvalue $\lambda k$.

Now, to find the trace of $T$ this way, one needs all its eigenvalues. A sufficient (but perhaps not necessary) condition is that $A$ and $B$ each have distinct eigenvalues and that the $mn$ products $\lambda k$ are pairwise distinct: then the eigenmatrices $xy^T$ give $mn$ distinct eigenvalues of $T$, which must be all of them, so $\operatorname{trace} T = \sum_{\lambda,k} \lambda k = (\operatorname{trace} A)(\operatorname{trace} B)$, and the lemma above follows immediately.
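
For a quick numerical sanity check of the eigenmatrix construction, here is a small sketch of my own (not from Artin; the sizes, seed, and use of numpy are arbitrary choices):

```python
import numpy as np

m, n = 3, 4
rng = np.random.default_rng(0)
A = rng.standard_normal((m, m))
B = rng.standard_normal((n, n))

# Take an eigenvector x of A and an eigenvector y of B^T.
lam, X = np.linalg.eig(A)
mu, Y = np.linalg.eig(B.T)
x, lam0 = X[:, 0], lam[0]
y, mu0 = Y[:, 0], mu[0]

M = np.outer(x, y)                               # the "eigenmatrix" x y^T
print(np.allclose(A @ M @ B, lam0 * mu0 * M))    # T(x y^T) = (lambda * k) x y^T -> True
```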

However, the question that arises is what to do if $T$ does not have $mn$ distinct eigenvalues. Artin mentions constructing sequences of matrices $A_k \rightarrow A$ and $B_k \rightarrow B$ such that each $A_k$ and $B_k$ has distinct eigenvalues and the products of their eigenvalues are all distinct. He then proceeds to say that the lemma follows by continuity.

I find this argument not rigorous: how can one choose sequences of matrices $A_k$ and $B_k$ like that? Can it be made rigorous? Someone told me that one can prove this lemma using Jordan normal form. Does anyone know of a proof using Jordan form?
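
For what it is worth, here is one way such a perturbation can be produced numerically. This is my own sketch, not Artin's construction: it uses a complex Schur decomposition (via scipy, an assumption) in place of Jordan form, since the Schur form is stably computable and also exposes the eigenvalues on the diagonal. One would nudge $B$ the same way and check that the products of eigenvalues stay distinct, which holds for a generic choice of the nudges.

```python
import numpy as np
from scipy.linalg import schur

def nearby_with_distinct_eigenvalues(A, eps=1e-6):
    """Return a matrix close to A whose eigenvalues are (generically) all distinct."""
    # Complex Schur form A = Q T Q*, with T upper triangular; its diagonal carries
    # the eigenvalues of A, so nudging the diagonal nudges the eigenvalues.
    T, Q = schur(A.astype(complex), output='complex')
    T = T.copy()
    np.fill_diagonal(T, T.diagonal() + eps * np.arange(1, A.shape[0] + 1))
    return Q @ T @ Q.conj().T

A = np.array([[2.0, 1.0],
              [0.0, 2.0]])                     # defective: eigenvalue 2 repeated
A_eps = nearby_with_distinct_eigenvalues(A)
print(np.linalg.eigvals(A_eps))                # two nearby but now distinct eigenvalues
print(np.max(np.abs(A_eps - A)))               # small: A_eps is close to A
```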

$\textbf{Update:}$ This lemma is more easily proven by bashing out the algebra, as shown in Robert's and Hans's answers below. However, what if we instead ask how to compute the eigenvalues of $T$?
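
Regarding the update: one standard way to get at all the eigenvalues of $T$ (this is not part of Artin's argument; it uses the vectorization identity $\operatorname{vec}(AMB) = (B^T \otimes A)\operatorname{vec}(M)$, with $\operatorname{vec}$ stacking columns) is to identify $T$ with the Kronecker product $B^T \otimes A$, whose eigenvalues are exactly the products $\lambda_i \mu_j$ of eigenvalues of $A$ and $B$. A small numpy sketch:

```python
import numpy as np

m, n = 3, 4
rng = np.random.default_rng(1)
A = rng.standard_normal((m, m))
B = rng.standard_normal((n, n))

# vec(A M B) = (B^T kron A) vec(M)  (vec = stack columns), so T "is" B^T kron A.
T_mat = np.kron(B.T, A)

eigs_T = np.sort_complex(np.linalg.eigvals(T_mat))
products = np.sort_complex(np.outer(np.linalg.eigvals(A),
                                    np.linalg.eigvals(B)).ravel())
print(np.allclose(eigs_T, products))                            # eigenvalues of T are lambda_i * mu_j
print(np.isclose(np.trace(T_mat), np.trace(A) * np.trace(B)))   # the lemma, numerically
```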

Thanks.

2 Answers


Not that this really answers your questions, but why not simply prove the lemma by computation?

If $E^{(ij)}$ is the matrix with a $1$ at position $(i,j)$ and zeros elsewhere, the "$(ab,ij)$" matrix entry when representing the operator $T$ in the basis $\{ E^{(ij)}\}$ is
$$ (A E^{(ij)} B)_{ab} = \sum_{k,l} A_{ak} E^{(ij)}_{kl} B_{lb} = A_{ai} B_{jb}. $$
Summing the diagonal entries "$(ij,ij)$" gives
$$ \operatorname{Tr} T = \sum_{i,j} (AE^{(ij)}B)_{ij} = \sum_{i,j} A_{ii} B_{jj} = \left( \sum_i A_{ii} \right) \left( \sum_j B_{jj} \right) = \operatorname{Tr} A \operatorname{Tr} B. $$
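
If it helps, the same computation can be checked numerically by writing out the matrix of $T$ in the basis $\{E^{(ij)}\}$ (a minimal sketch of my own; the sizes and seed are arbitrary):

```python
import numpy as np

m, n = 2, 3
rng = np.random.default_rng(2)
A = rng.standard_normal((m, m))
B = rng.standard_normal((n, n))

# Column (i, j) of the mn x mn matrix of T holds the coordinates of T(E^(ij)) = A E^(ij) B
# in the same basis, using the row-major ordering (i, j) -> i*n + j.
T_mat = np.zeros((m * n, m * n))
for i in range(m):
    for j in range(n):
        E = np.zeros((m, n))
        E[i, j] = 1.0
        T_mat[:, i * n + j] = (A @ E @ B).ravel()

print(np.isclose(np.trace(T_mat), np.trace(A) * np.trace(B)))   # Tr T = Tr A * Tr B -> True
```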

  • @AmiteshDatta You mentioned to me that this argument can be made rigorous (the continuity argument we talked about); apparently there is a flaw when one passes to the limit. Do you know how to find a condition under which one can get *all* the eigenvalues of $T$? (2011-10-22)

Yes, it is easy to do using Jordan normal form, since for a matrix in Jordan form it is easy to perturb the eigenvalues by changing the diagonal entries slightly. But Jordan form is not available over every field, and I think this is easier to prove more directly. Note that both sides of the equation $\operatorname{trace} T = (\operatorname{trace} A)(\operatorname{trace} B)$ are linear in $A$ and in $B$, so it suffices to prove the identity when $A$ and $B$ run over bases of the spaces of $m \times m$ and $n \times n$ matrices. So we can take $A = e_i e_j^T$ and $B = e_k e_l^T$ for $1 \le i,j \le m$ and $1 \le k,l \le n$, where $e_i$ is the vector (of the appropriate size) with a $1$ in the $i$'th place and $0$ everywhere else. For ${\mathbb C}^{m \times n}$ use the basis of elements $e_r e_s^T$. Then
$$ \operatorname{trace} T = \sum_{r,s} (e_i e_j^T e_r e_s^T e_k e_l^T)_{r,s} = \sum_{r,s} \delta_{ir} \delta_{jr} \delta_{sk} \delta_{ls} = \delta_{ij} \delta_{kl} = \operatorname{trace}(e_i e_j^T)\operatorname{trace}(e_k e_l^T). $$
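
In the same spirit, here is a small numpy check (my own sketch, not part of the answer) of exactly this reduction: the identity is verified on every pair of rank-one basis matrices $e_i e_j^T$, $e_k e_l^T$, and bilinearity in $A$ and $B$ then gives it in general.

```python
import numpy as np

m, n = 2, 3

def trace_T(A, B):
    """Trace of M -> A M B, computed in the basis {e_r e_s^T} of C^(m x n)."""
    total = 0.0
    for r in range(m):
        for s in range(n):
            E = np.zeros((m, n))
            E[r, s] = 1.0
            total += (A @ E @ B)[r, s]     # diagonal coefficient on e_r e_s^T
    return total

# Check trace T = trace(A) trace(B) for A = e_i e_j^T and B = e_k e_l^T,
# where trace(e_i e_j^T) = delta_ij and trace(e_k e_l^T) = delta_kl.
I_m, I_n = np.eye(m), np.eye(n)
ok = all(
    np.isclose(trace_T(np.outer(I_m[i], I_m[j]), np.outer(I_n[k], I_n[l])),
               float(i == j) * float(k == l))
    for i in range(m) for j in range(m)
    for k in range(n) for l in range(n)
)
print(ok)   # True
```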

  • Do you know how one can compute all the eigenvalues of such an operator $T$? Hans above has given an argument but I am not well versed in such techniques. (2011-10-20)