
I just had a short look at the definition of a vector space and couldn't find any obvious reason why

$\lambda a = a \lambda$

where $\lambda$ is an element of the field $K$ and $a$ is an element of the set $M$, should be true. It's kind of intuitive, since all the vector spaces I normally come across (polynomials, infinitely differentiable functions, $\mathbb{R}^n$) have this property. However, since $\lambda \in K$ and $a \in M$, I see no justification for this product being commutative. Does this depend on the definition of the product between the field and the set? Or is it generally true? If not, it would be great if you could provide some examples.

Thanks in advance.

  • 2
    Notice that scalar multiplication is usually defined as a function $K\times\mathbf{V}\to\mathbf{V}$, where we denote the image of the ordered pair $(\lambda,\mathbf{v})$ by $\lambda\mathbf{v}$; as such, scalar multiplication is always "a scalar times a vector", and never "a vector times a scalar" (though one can *define* such an operation, as Qiaochu says); that's why we almost invariably write the scalars on the same side. In the vector space case there are no problems; one can define analogous structures where the scalars are taken from division rings, and then the side *does* matter.2011-01-08

2 Answers

8

Scalar multiplication is generally not defined on the right. You can define $a \lambda = \lambda a$, and then it will of course be true.
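A minimal sketch of this point in code (the `Vec` class and its method names are purely illustrative, not from any library): a vector space only ships with a *left* scalar action $K \times V \to V$, and the right action is introduced as a definition in terms of it, so the "commutativity" holds by construction rather than as an extra axiom.

```python
# A vector space comes with left scalar multiplication K x V -> V;
# right multiplication "v * lambda" is *defined* as "lambda * v".
# (Vec and its methods are illustrative, not from any library.)

class Vec:
    def __init__(self, *coords):
        self.coords = tuple(coords)

    def __rmul__(self, scalar):   # lambda * v : the given left action
        return Vec(*(scalar * c for c in self.coords))

    def __mul__(self, scalar):    # v * lambda : defined via the left action
        return scalar * self

    def __eq__(self, other):
        return self.coords == other.coords

v = Vec(1.0, 2.0)
assert 3.0 * v == v * 3.0  # true by definition, not by a new axiom
```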

  • 0
    But if we have an abstract vector space, what does it mean to "multiply" by a linear map on the right? (Could we construct a sensible vector space with left and right actions by the endomorphism ring where left and right scalar multiplication disagree?)2011-01-08
  • 0
    @Zhen Lin: whoops. Ignore that.2011-01-08
  • 0
    Isn't the field-module condition enough to ensure commutativity of the scalar product, as I said in my answer? My professors always said it was so.2011-01-08
  • 0
    @Andy: the condition ensures that you _can_ do this, yes. But I don't see any reason you _have to._2011-01-08
  • 0
    @Qiaochu Yuan: I find it kind of useful if you're transposing things, something like $(\lambda x)^T = x^T \lambda = \lambda x^T$. Of course, this is a particular case; maybe in general it's not really necessary :)2011-01-08
0

A vector space is a particular ring module: a module of a field over an abelian group. In this case, since it is defined over a field (which is commutative by definition), the left module (where multiplication by a scalar is written on the left) and the right module (where it is written on the right) coincide; it is a bimodule.

For the definition of $R$-modules you can look at Wikipedia.

  • 2
    For the record, this is not the only compatible $(F,F)$-bimodule structure on an $F$-vector space (left $F$-module). For any nontrivial automorphism $\sigma : F \to F$, you can define $a \lambda = \sigma(\lambda) a$, and this defines an $(F,F)$-bimodule structure where the left and right actions disagree.2011-01-08
  • 0
    Dear Andy, While you are correct that you *can* regard a left module $M$ over a commutative ring $A$ as a right module just by defining $m \cdot a := a \cdot m$ for $a \in A$ and $m \in M$, this a particular choice of bimodule structure on $M$. As Qiaochu notes in his comment, it is also quite possible to have $(A,A)$-bimodules in which the left and right actions are distinct. Regards,2011-01-08
  • 0
    Yes, of course, but I was just using this terminology to familiarise the OP with these kinds of structures (because I assumed he hadn't had any contact with modules before). I don't think any of the observations I made is wrong, so I can't really understand the downvote.2011-01-08
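Qiaochu's twisted bimodule can be made concrete by taking $F = \mathbb{C}$ and $\sigma$ = complex conjugation (a nontrivial field automorphism of $\mathbb{C}$). A minimal sketch with illustrative helper names: the left action is ordinary scalar multiplication, the right action is $v \cdot \lambda := \sigma(\lambda)\, v = \overline{\lambda}\, v$. The two actions disagree on any non-real scalar, yet the right action is still a genuine module action compatible with the left one.

```python
# Concrete instance of the sigma-twisted bimodule from the comment above,
# with F = C and sigma = complex conjugation. Helper names are illustrative.

def left_act(lam, v):
    """Ordinary left scalar multiplication lambda * v on C^n."""
    return tuple(lam * c for c in v)

def right_act(v, lam):
    """Twisted right action v . lambda := conj(lambda) * v."""
    return tuple(lam.conjugate() * c for c in v)

v = (1 + 0j, 2 + 0j)
lam, mu = 1j, 2 + 1j

# The two actions disagree for non-real scalars:
assert left_act(lam, v) != right_act(v, lam)

# Yet right_act is a genuine right action: (v . lam) . mu == v . (lam * mu),
# because conj(mu) * conj(lam) == conj(lam * mu):
assert right_act(right_act(v, lam), mu) == right_act(v, lam * mu)

# And the bimodule compatibility (lam . v) . mu == lam . (v . mu) holds:
assert right_act(left_act(lam, v), mu) == left_act(lam, right_act(v, mu))
```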