
Prove $L \otimes M = M \otimes L$ only if either $L=0$ or $M=0$

I saw this statement in *Linear Algebra* (2nd ed.) by Hoffman and Kunze. I can't figure out how to prove it. The multilinear forms $L$ and $M$ map $V^r$ and $V^s$, respectively, into $K$, where $K$ is a commutative ring with identity and $V$ is a $K$-module. I think it is reasonable to exclude the case $L=M$.

Thanks.

Added by the crowd. Here's the relevant excerpt.

[Scanned excerpt from Hoffman & Kunze (top of p. 168): the definition of $L \otimes M$ and the claim in question.]

  • 0
    When you say commutative, you mean that $L \otimes M$ and $M \otimes L$ are the same forms on $V^{r + s}$?2012-07-01
  • 0
    Exactly. The statement is $L \otimes M = M \otimes L$ only if either $L=0$ or $M=0$.2012-07-01
  • 1
    Is $V$ an arbitrary $K$-module or a finite free $K$-module? (In fact, I wonder if your $V$ should be $K$.)2012-07-01
  • 0
    Where is this statement in H–K, by the way? [Ah, found it: top of pg. 168.]2012-07-01
  • 0
    It's $V$ not $K$. Although it is not said explicitly near the statement in the book, I think right now we can suppose that $V$ is a finite free $K$-module.2012-07-01
  • 1
    What if I take $V = K$, $r = s = 1$, $L = M$ the identity map $K \to K$? Then both of the resulting maps $K \times K \to K$ are just multiplication. Am I crazy?2012-07-01
  • 0
    Actually I thought of similar situations too. I think we should eliminate the trivial situation that $L=M$.2012-07-01
  • 2
    I think excluding $L=M$ is a bad idea. One should either figure out what question they actually *meant* to ask (or possibly that they did ask and you got the context wrong), or acknowledge that the statement is simply *wrong*, and replace it with a correct statement.2012-07-01
  • 0
    @Hurkyl Rigorously the original statement in the book is wrong. But I think by excluding $L=M$ (which I guess is the authors' assumption), the statement could be correct.2012-07-01
  • 0
    Also: If $r\ne s$ then how are the two tensors both well-defined elements of the same tensor product?2012-07-01
  • 0
    @anon Maybe I misunderstand your comment. I think $r \neq s$ is not a problem. Both $L \otimes M$ and $M \otimes L$ are multilinear forms from $V^{r+s}$ into $K$. They could be the same.2012-07-01
  • 0
    You mean maps $V^r\times V^s\to K$ that are linear in the first and second arguments (i.e. an element of $V^r\otimes V^s$). More precisely, *pure* tensors which factor as a linear map of one argument times a linear map of the other. (Also, I too wonder if $V$ was supposed to be $K$.) For example, $L:x\mapsto x$ and $M:(a,b)\mapsto a+b$ are functionals on $K$ and $K^2$ respectively, and $L\otimes M:K\times K^2\to K$ is sensible, given by $(x,(a,b))\mapsto x(a+b)$. But how is $M\otimes L:K\times K^2\to K$ formed?2012-07-01
  • 0
    @MathFun Nevermind. H&K would be right in that case, since all the functionals would be zero.2012-07-01
  • 0
    @anon I don't see the issue. Can't we think of both of those as tri-linear maps out of $K^3$?2012-07-01
  • 0
    @anon First, again it definitely is $V$ (it would be better if you have the book). Second, in the book, $L \otimes M$ is defined as a function on $V^{r+s}$, which is defined as $V\times V \times \cdots \times V$. $( L \otimes M )(a_1,\ldots,a_{r+s})=L(a_1,\ldots,a_r)M(a_{r+1},\ldots,a_{r+s})$. So for your example, I think both $L \otimes M$ and $M \otimes L$ are defined as $K^3\to K$.2012-07-01
  • 0
    I see. This is different from how I've seen tensors defined.2012-07-01
  • 4
    There are definitely exceptions other than $L=M$. For example, $L=\lambda M$ for some scalar $\lambda$. Less trivially, if $L=M\otimes M$. What if we suppose that neither of $L$ or $M$ is a scalar multiple of a tensor power of the other?2012-07-01
  • 3
    We also have to worry about things like $L=N\otimes N$, $M=N\otimes N\otimes N$. What if we suppose that no tensor power of $L$ is a scalar multiple of a tensor power of $M$, and vice versa?2012-07-01
  • 1
    I added a scan of the paragraph that contains the claim, and the one just before it that defines $L \otimes M$. It might be better to just cook up an example in which, indeed, $L \otimes M \neq M \otimes L$ and then move on for now. [Also, I'm somewhat dismayed that an algebra book treats tensors in this way.]2012-07-01
  • 1
    To be fair, multilinear forms on a fixed vector space of interest is a rather important case. If that's all the text uses, a case could be made for a streamlined definition of this special case. e.g. associativity is an equation, not a natural isomorphism.2012-07-01
  • 0
Thanks for all your help. Jonas Meyer's counterexamples are enough for me to conclude that the statement is wrong.2012-07-02
  • 0
@MathFun anon is right. I haven't read this book, but it can happen that the authors, many pages earlier, assumed explicitly or implicitly that $r\neq s$, so please check. In the special case $r=s$, of course, the statement is not true.2012-07-03
  • 0
    @vesszabo Actually whether $r=s$ or not doesn't matter. Check Jonas Meyer's comments. The statement is wrong.2012-07-09

2 Answers


After such a long chain of (old) comments, a summary is in order. By definition, $L\otimes M$ is the function on $V^{r+s}$ such that $$(L\otimes M)(\alpha_1,\dots,\alpha_{r+s})=L(\alpha_1,\dots,\alpha_r)\,M(\alpha_{r+1},\dots,\alpha_{r+s}) \tag1$$ Similarly, $$(M\otimes L)(\alpha_1,\dots,\alpha_{r+s})=M(\alpha_1,\dots,\alpha_s)\,L(\alpha_{s+1},\dots,\alpha_{s+r}) \tag2$$ The following is a sufficient condition for the functions defined by (1) and (2) to be equal, pointed out in comments by Jonas Meyer.
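As a quick sanity check of definitions (1) and (2), here is a small numerical sketch, assuming $K = \mathbb{R}$, $V = \mathbb{R}^2$, and $r = s = 1$; the names `form` and `tensor` are illustrative, not from the book:

```python
# Sketch: K = R, V = R^2, r = s = 1 (illustrative assumptions).

def form(c):
    """The 1-linear form v -> c[0]*v[0] + c[1]*v[1] on V = R^2."""
    return lambda v: sum(ci * vi for ci, vi in zip(c, v))

def tensor(L, r, M, s):
    """Definition (1): (L (x) M)(a_1,...,a_{r+s})
    = L(a_1,...,a_r) * M(a_{r+1},...,a_{r+s})."""
    def LM(*args):
        return L(*args[:r]) * M(*args[r:])
    return LM

L = form([1.0, 0.0])   # picks out the first coordinate
M = form([0.0, 1.0])   # picks out the second coordinate

LM = tensor(L, 1, M, 1)   # definition (1)
ML = tensor(M, 1, L, 1)   # definition (2)

a, b = [1.0, 0.0], [0.0, 1.0]
print(LM(a, b), ML(a, b))  # 1.0 0.0 -- the two bilinear forms differ here
```

This also shows that, in general, the two products are genuinely different functions on $V^{r+s}$.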

$(*)\qquad$ $L$ and $M$ are scalar multiples of some tensor powers of a multilinear function $N$.

Since $(*)$ does not require either $L$ or $M$ to vanish, the statement in the book is false.
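Jonas Meyer's counterexample can also be checked numerically. A sketch, assuming $K = \mathbb{R}$, $V = \mathbb{R}^2$, with $N$ the nonzero 1-form picking out the first coordinate, $L = N \otimes N$ ($r = 2$) and $M = N \otimes N \otimes N$ ($s = 3$); the helper name `tensor_power` is made up for this illustration:

```python
import random

# Sketch: K = R, V = R^2; N is a nonzero 1-form, L = N(x)N, M = N(x)N(x)N.

def N(v):
    return v[0]

def tensor_power(forms):
    """(f_1 (x) ... (x) f_k)(a_1,...,a_k) = f_1(a_1) * ... * f_k(a_k)."""
    def F(*args):
        out = 1.0
        for f, arg in zip(forms, args):
            out *= f(arg)
        return out
    return F

L = tensor_power([N, N])       # r = 2
M = tensor_power([N, N, N])    # s = 3

def LxM(*args): return L(*args[:2]) * M(*args[2:])   # definition (1)
def MxL(*args): return M(*args[:3]) * L(*args[3:])   # definition (2)

random.seed(0)
vecs = [[random.random(), random.random()] for _ in range(5)]
print(abs(LxM(*vecs) - MxL(*vecs)) < 1e-12)  # True: the products agree
print(L([1.0, 0.0], [1.0, 0.0]))             # 1.0: yet L is not zero
```

Both products equal $N^{\otimes 5}$, so they agree at every point even though neither factor vanishes.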

It hasn't been resolved whether $(*)$ is also a necessary condition for $L\otimes M=M\otimes L$.

  • 0
    I've substantially revised my answer.2013-07-05

Consider each linear functional as a column vector in the dual space with respect to some basis. The tensor product of $n$ such vectors can then be visualized as a hypermatrix (or tensor, as it is usually called in physics) of order $n$, where each entry is a product of one entry from each column vector. Tensoring two vectors $a, b$ gives a matrix in which each row is a multiple of the second vector, with the scalar for each row given by the corresponding entry of the first vector.

Tensoring such a matrix with a third vector gives a cubical hypermatrix in which each horizontal slice is a multiple of the base matrix, with the multiples given by the entries of the third vector, and so on.

Note that every row (i.e., every one-dimensional fiber) in the $i$th direction of the hypermatrix is a multiple of the $i$th vector in the tensor product. Either every such row is zero (and the tensor is zero), or we can recover the $i$th vector of the tensor product up to a scalar factor.

This means that in a non-zero tensor product, the multiplicands and their order are uniquely determined up to scalar factors. Switching the order of tensoring as described in the problem is equivalent to applying a cyclic permutation to the indices of an $(r+s)$-dimensional tensor (i.e., shifting every index by $s$ modulo $r+s$). If the tensor is invariant under such a cyclic permutation, then its multiplicands must be, up to scalar factors, periodic with period $\gcd(s,r+s)=\gcd(s,r)$; i.e., the two forms being tensored must be scalar multiples of tensor powers of a common tensor.
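The fiber argument above can be illustrated with a small NumPy sketch, assuming real scalars and a pure tensor of three made-up vectors:

```python
import numpy as np

# Illustrative sketch: K = R, pure tensor of three (made-up) vectors.
a = np.array([1.0, 2.0])
b = np.array([3.0, -1.0])
c = np.array([0.5, 4.0])

# T[i, j, k] = a[i] * b[j] * c[k]  -- the hypermatrix of a (x) b (x) c.
T = np.einsum('i,j,k->ijk', a, b, c)

# A fiber in the first direction (vary i, fix j and k) is a multiple of a,
# so a can be recovered up to a scalar from any nonzero fiber:
fiber = T[:, 0, 0]                    # equals b[0] * c[0] * a
ratio = fiber / a
print(np.allclose(ratio, ratio[0]))   # True

# Cyclically shifting the axes corresponds to cyclically permuting the factors:
S = np.einsum('i,j,k->ijk', c, a, b)  # hypermatrix of c (x) a (x) b
print(np.allclose(S, np.moveaxis(T, 2, 0)))  # True
```

So a pure tensor determines its factors up to scale, and cyclic invariance of the hypermatrix constrains those factors exactly as described.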