
I've been doing functional programming for a long time, and in that discipline, which is based on lambda calculus and therefore on function application and composition, the juxtaposition of f to x, as in f x, is called the "application of f to x" just as f(x) is.

Matrix multiplication seems to follow a similar syntactic intuition: multiplying a matrix on the left against another matrix (or a vector) on the right yields another matrix or vector, with the left matrix's entries serving as the coefficients of a linear function applied to the right-hand side.

Specifically, multiplying one matrix by another is even called matrix "composition," which happens to be the same term used for combining one function with another.
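To make the correspondence concrete, here is a small numpy sketch (with illustrative matrices chosen for this example) showing that the matrix product is exactly the composition of the two linear maps, and that, like function composition in general, it is not commutative:

```python
import numpy as np

# Two linear maps on R^2, represented as matrices (illustrative values).
A = np.array([[0, -1],
              [1,  0]])   # rotate 90 degrees counterclockwise
B = np.array([[2, 0],
              [0, 1]])    # scale the x-axis by 2

v = np.array([1, 1])

# Applying B, then A, is the same as applying the single matrix A @ B:
# (A . B)(v) = A(B(v)).
composed = A @ (B @ v)
product = (A @ B) @ v
assert np.array_equal(composed, product)

# Matrix multiplication is not commutative, just as composition
# of functions generally is not.
assert not np.array_equal(A @ B, B @ A)
```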

So then, given that similarity to function application, the fact that matrix multiplication involves more than just multiplying corresponding elements, and the fact that it isn't commutative the way previously encountered operations called "multiplication" are, is it simply for historical reasons that matrix multiplication isn't called "matrix application"? Is there some characteristic of matrix multiplication that thinking of it as multiplication, rather than as function application, helps one understand?

  • 0
    It may not be commutative, but it is associative and it is distributive with regard to addition, so it is a bit like multiplication... – 2017-02-20
  • 0
    @5xum There are a lot of things that are distributive and associative, involve some multiplication, but aren't called multiplication, though. That's what throws me. If it had most of the commonly recognized properties of scalar multiplication, the name would click better -- at least for me. – 2017-02-20
  • 0
    [Wikipedia](https://en.wikipedia.org/wiki/Matrix_multiplication) : "When two linear transformations are represented by matrices, then the matrix product represents the composition of the two transformations." The justification of your second paragraph right there. – 2017-02-20

1 Answer


While matrices can be viewed as linear operators (functions) on a vector space, it is a more general fact that $n \times n$ matrices form a natural monoid, so it makes sense to speak of their "composition" as functions in terms of their "multiplication" in that monoid (or equivalently, that semigroup with identity).

More commonly cited is the fact that if $GL_n$ denotes the set of invertible $n \times n$ matrices, this forms a group under function composition, so it also makes sense to speak of matrix "multiplication."
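The monoid and group structure the answer describes can be checked numerically; in this sketch (matrices chosen arbitrarily for illustration), multiplication is associative, the identity matrix acts as the monoid identity, and an invertible matrix, i.e. an element of $GL_2$, has a multiplicative inverse:

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])
B = np.array([[0., 1.],
              [1., 1.]])
C = np.array([[2., 0.],
              [0., 3.]])
I = np.eye(2)

# Monoid axioms: associativity and a two-sided identity.
assert np.allclose((A @ B) @ C, A @ (B @ C))
assert np.allclose(A @ I, A) and np.allclose(I @ A, A)

# A has nonzero determinant, so it lies in GL_2, where every
# element additionally has an inverse under multiplication.
A_inv = np.linalg.inv(A)
assert np.allclose(A @ A_inv, I)
assert np.allclose(A_inv @ A, I)
```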

"Multiplication" is the more general term: it need not be commutative, so long as it satisfies the monoid (or, for invertible elements, group) axioms.

  • 1
    Awesome, persuasive, informative answer. Thanks!2017-02-20
  • 0
    No problem :). I'm glad that it was convincing.2017-02-20