
Let $A\in M_n(F)$ and define $\phi:M_n(F)\to M_n(F)$ by $\phi(X)=AX$ for all $X\in M_n(F)$. Prove that $\det(\phi)=\det(A)^n$.

I can prove it by considering the matrix representation with respect to the usual basis, which turns out to be a block diagonal form consisting of $n$ copies of $A$. Nevertheless, I'm looking for a clean (basis-free) approach to this problem.
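The coordinate computation mentioned above is easy to sanity-check numerically. A minimal NumPy sketch, assuming column-major vectorization so that the matrix of $\phi$ in the standard basis is the block-diagonal Kronecker product $I_n \otimes A$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))

# With column-major vectorization, vec(A X) = (I_n ⊗ A) vec(X),
# so the matrix of phi is block diagonal with n copies of A.
Phi = np.kron(np.eye(n), A)

# det(phi) = det(A)^n
assert np.allclose(np.linalg.det(Phi), np.linalg.det(A) ** n)
```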

  • 0
    @QiaochuYuan: Thanks!2012-06-07

2 Answers

2

Think of $M_n(F)$ as $V \otimes V^{\ast}$ where $V = F^n$ and $V^{\ast}$ is the dual space. The multiplication map $M_n(F) \times M_n(F) \to M_n(F)$ is a map $(V \otimes V^{\ast}) \otimes (V \otimes V^{\ast}) \to V \otimes V^{\ast}$

which, as it turns out, can be identified with tensor contraction of the two middle terms. It follows that $M_n(F)$, as a left $M_n(F)$-module, can be identified with the tensor product of $V$ (with the natural structure of an $M_n(F)$-module) and $V^{\ast}$ (regarded just as a vector space, the "multiplicity space" of $V$), which is a coordinate-free way of saying that it consists of a direct sum of $n$ copies of $V$ (which in turn is a reformulation of your statement about block diagonals).
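The identification with a direct sum of $n$ copies of $V$ can be seen very concretely: left multiplication by $A$ acts on each column of $X$ independently. A small NumPy illustration of this column-wise action:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
X = rng.standard_normal((n, n))

# The j-th column of A X is A applied to the j-th column of X,
# i.e. M_n(F) = V ⊕ ... ⊕ V (n copies) as a left M_n(F)-module.
for j in range(n):
    assert np.allclose((A @ X)[:, j], A @ X[:, j])
```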

Some module theory puts this result in useful context. The above argument shows that $M_n(F)$ is semisimple as a module over itself, hence that it is a semisimple ring, which means every module over $M_n(F)$ is a direct sum of simple modules. Moreover, the argument shows that $F^n$ is the unique simple module of $M_n(F)$ (since any simple module of a ring $R$ appears as a quotient of $R$ regarded as a module over itself), so in fact every module over $M_n(F)$ whatsoever breaks up into a direct sum of copies of $F^n$. This reflects the fact that $M_n(F)$ is Morita equivalent to $F$.

  • 0
A simpler but slightly more coordinate-dependent way of stating the above is the following. Let $V = F^n$ and $W$ be another vector space. Consider the composition map $\text{Hom}(W, V) \times \text{Hom}(V, V) \to \text{Hom}(W, V)$. This map gives $\text{Hom}(W, V)$ the structure of an $\text{End}(V)$-module (left, even though $\text{Hom}(V, V)$ is on the right; this is due to bad function composition notation) and I claim that this structure respects direct sums in $W$. By choosing a direct sum decomposition of $W = V$ into one-dimensional subspaces the result follows. (2012-06-07)
2

The eigenvalue equation is $\phi(X)=\lambda_k X$, i.e. $AX=\lambda_k X$, so the nonzero columns of such an $X$ are eigenvectors of $A$ corresponding to the eigenvalue $\lambda_k$. The operator $A$ has $n$ eigenvalues counting multiplicity, and the multiplicity of $\lambda_k$ as an eigenvalue of $\phi$ is $n$ times its multiplicity as an eigenvalue of $A$. This is because a matrix with one column an eigenvector of $A$ for $\lambda_k$ and the remaining columns zero is an eigenvector of $\phi$ for $\lambda_k$, and since there are $n$ columns, there are $n$ times the multiplicity of $\lambda_k$ of these vectors in the basis. So each eigenvalue $\lambda_k$ of $A$ (listed without multiplicity) contributes multiplicity $n$ to the spectrum of $\phi$, so that $\det \phi=\lambda_1^n\cdots\lambda_n^n=(\det A)^n.$ I hope this is understandable; if not, ask.
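For a diagonalizable $A$ the multiplicity claim can be checked numerically: realizing $\phi$ as $I_n \otimes A$ (column-major vectorization), its spectrum is that of $A$ with every eigenvalue repeated $n$ times. A sketch, assuming NumPy and a generic (hence diagonalizable) random matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
A = rng.standard_normal((n, n))

# Spectrum of phi = I ⊗ A: each eigenvalue of A repeated n times,
# hence det(phi) = (lambda_1 ... lambda_n)^n = det(A)^n.
eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_phi = np.sort_complex(np.linalg.eigvals(np.kron(np.eye(n), A)))
assert np.allclose(eig_phi, np.repeat(eig_A, n))
```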

  • 0
Well if $A$ is not diagonalizable, for one thing what "multiplicity of $\lambda_k$ of these vectors" are you talking about: dimension of the eigenspace (i.e., the set of these vectors) or multiplicity as root of the characteristic polynomial? Also "having ... is an eigenbasis corresponding to $\lambda_k$" isn't clear; basis of what space? Just a basis for the eigenspace won't suffice for getting the right exponent for use in the determinant. (2012-06-07)