
Prove $L \otimes M = M \otimes L$ only if either $L=0$ or $M=0$

I saw this statement in Linear Algebra (2nd ed.) by Hoffman and Kunze, and I can't figure out how to prove it. Here $L$ and $M$ are multilinear forms from $V^r$ and $V^s$, respectively, into $K$, where $K$ is a commutative ring with identity and $V$ is a $K$-module. I think it is reasonable to exclude the case $L=M$.

Thanks.

Added by the crowd. Here's the relevant excerpt.

[Scanned images of the relevant book excerpt]

  • @vesszabo Actually whether $r=s$ or not doesn't matter. Check Jonas Meyer's comments. The statement is wrong. – 2012-07-09

2 Answers


After such a long chain of (old) comments, a summary is in order. By definition, $L\otimes M$ is the function on $V^{r+s}$ such that
$$(L\otimes M)(\alpha_1,\dots,\alpha_{r+s})=L(\alpha_1,\dots,\alpha_r)\,M(\alpha_{r+1},\dots,\alpha_{r+s}) \tag{1}$$
Similarly,
$$(M\otimes L)(\alpha_1,\dots,\alpha_{r+s})=M(\alpha_1,\dots,\alpha_s)\,L(\alpha_{s+1},\dots,\alpha_{s+r}) \tag{2}$$
The following is a sufficient condition for the functions defined by (1) and (2) to be equal, pointed out in comments by Jonas Meyer.

$(*)\qquad$ $L$ and $M$ are scalar multiples of some tensor powers of a multilinear function $N$.

Since $(*)$ does not require either $L$ or $M$ to vanish, the statement in the book is false.
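The sufficiency of $(*)$ is easy to check numerically. A minimal sketch in NumPy, representing each multilinear form by its coefficient array so that $\otimes$ becomes an outer product (the functional $N$, here the vector `n`, and the scalars `3` and `5` are illustrative choices, not from the book):

```python
import numpy as np

# N: a linear functional on V = R^3, represented by its coefficient vector.
n = np.array([1.0, 2.0, -1.0])

# L = 3 * (N ⊗ N), an r = 2 form;  M = 5 * N, an s = 1 form.
# Both are scalar multiples of tensor powers of N, so (*) holds.
L = 3.0 * np.multiply.outer(n, n)
M = 5.0 * n

# Tensor products (1) and (2) as outer products of coefficient arrays.
LM = np.multiply.outer(L, M)   # (L ⊗ M)(a1, a2, a3) = L(a1, a2) M(a3)
ML = np.multiply.outer(M, L)   # (M ⊗ L)(a1, a2, a3) = M(a1) L(a2, a3)

print(np.allclose(LM, ML))     # True: L ⊗ M = M ⊗ L, yet L ≠ 0 and M ≠ 0
```

Both products equal $15\,n_i n_j n_k$ entrywise, which is why they coincide even though neither factor vanishes.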

It hasn't been resolved whether $(*)$ is also a necessary condition for $L\otimes M=M\otimes L$.

  • I've substantially revised my answer. – 2013-07-05

Consider each linear function as a column vector in the dual space with respect to some basis. The tensor product of $n$ such vectors can then be visualized as a hypermatrix (or tensor, as it is usually called in physics) of dimension $n$, where each entry is a product of entries from the column vectors. Tensoring two vectors $a,b$ gives a matrix in which each row is a multiple of the first vector, the multiple of each row being given by the corresponding entry of the second vector.

Tensoring such a matrix with a third vector gives a cubical array in which each horizontal slice is a multiple of the base matrix, the multiples being given by the entries of the third vector, and so on.

Note that every row (i.e. every one-dimensional slice) in the $i$th direction of the hypermatrix is a multiple of the $i$th vector in the tensor product. Either every such row is zero (and the tensor is zero), or we can recover the $i$th vector in the tensor product up to a scalar factor.
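This recovery step can be sketched in NumPy for a rank-one matrix (the vectors `a`, `b` are illustrative):

```python
import numpy as np

a = np.array([1.0, 0.0, 2.0])
b = np.array([3.0, -1.0, 4.0])

T = np.multiply.outer(a, b)   # rank-1 matrix: T[i, j] = a[i] * b[j]

# Every row of T is a multiple of b; any non-zero row
# recovers b up to a scalar factor.
row = T[2]                    # = a[2] * b = 2 * b
print(np.allclose(row, 2.0 * b))    # True

# Likewise every column is a multiple of a.
col = T[:, 0]                 # = b[0] * a = 3 * a
print(np.allclose(col, 3.0 * a))    # True
```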

This means that in a non-zero tensor product, the multiplicands and their order are uniquely determined up to scalar factors. Switching the order of tensoring as described in the problem amounts to permuting the multiplicands of an $(r+s)$-dimensional tensor by a cyclic permutation (i.e. shifting every index by $s$ and reducing indices mod $r+s$). If the tensor is invariant under such a cyclic permutation, then the sequence of multiplicands must be, up to scalar factors, periodic with period $\gcd(s,r+s)=\gcd(s,r)$, i.e. the two forms being tensored must be scalar multiples of tensor powers of the same tensor.
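The cyclic-shift description can be checked directly with `np.moveaxis`, again identifying forms with their coefficient arrays (the choice $r=2$, $s=1$ and the vector `n` are illustrative): $M\otimes L$ is $L\otimes M$ with its axes shifted cyclically by $s$, and when $L$ and $M$ are tensor powers of the same vector the array is invariant under that shift.

```python
import numpy as np

n = np.array([1.0, 2.0, -1.0])
r, s = 2, 1
L = np.multiply.outer(n, n)      # L = N ⊗ N
M = n.copy()                     # M = N

LM = np.multiply.outer(L, M)     # L ⊗ M, an (r+s)-dimensional array
ML = np.multiply.outer(M, L)     # M ⊗ L

# M ⊗ L is L ⊗ M with the last s axes moved to the front,
# i.e. every index shifted cyclically by s.  This holds for ANY L, M.
shifted = np.moveaxis(LM, list(range(r, r + s)), list(range(s)))
print(np.allclose(ML, shifted))  # True

# Because L and M are tensor powers of the same N, the array L ⊗ M
# is invariant under that shift, so the two products coincide.
print(np.allclose(LM, ML))       # True
```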