8

The question is prompted by change of basis problems -- the book keeps multiplying the bases by the matrix $S$ from the left in order to keep the subscripts nicely and obviously matching, but in the examples bases get multiplied by $S$ (the change of basis matrix) from either side. So is matrix multiplication commutative if at least one of the matrices is invertible?
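
A quick way to experiment with this numerically (a sketch using NumPy; the matrices below are arbitrary invertible examples, not taken from the book):

```python
import numpy as np

S = np.array([[1.0, 1.0],
              [0.0, 1.0]])          # invertible: det(S) = 1
A = np.array([[1.0, 0.0],
              [1.0, 1.0]])          # also invertible

print(S @ A)                        # [[2. 1.], [1. 1.]]
print(A @ S)                        # [[1. 1.], [1. 2.]]
print(np.allclose(S @ A, A @ S))    # False -- they do not commute
```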

  • 0
    Though I guess $S$ has to be invertible in this case too, since it is changing a set of independent vectors to another set of independent vectors... But the question still persists (not just with respect to change of basis matrices I guess).2011-02-11
  • 0
Only diagonal matrices commute with other matrices.2011-02-11
  • 0
    @Yuan: That's not true. Only multiples of the identity commute with all matrices. Diagonal matrices only commute among each other. In more fancy language: the centre of the matrix algebra is trivial, but the diagonal matrices form a commutative subring.2011-02-11
  • 1
    @Rasmus: Yuan said "only", so I would say that it is true. Rephrased: "If a matrix commutes with every invertible matrix, it is diagonal." Of course, it is better to replace "diagonal" by "scalar" since then the other implication also holds...2011-02-11
  • 5
    Note to the OP: your question is a perfectly fine one, but it's also a question you could probably have answered for yourself if you tried a few examples (in a sense I will not try to make precise here, "most" pairs of invertible matrices do not commute). The practice of testing one's questions out with actual examples is both useful and enjoyable -- in sister disciplines, it is called the "scientific method". I recommend it to you most highly.2011-02-11
  • 2
    @Pete:If the comment was meant this way, I think that it is formulated in a misleading way.2011-02-11

2 Answers

13

Definitely not. Yuan's comment is also not correct: diagonal matrices do not necessarily commute with non-diagonal matrices. Consider $$\left[\begin{array}{cc} 1 & 1\\ 0 & 1\end{array}\right]\left[\begin{array}{cc} a & 0\\ 0 & b\end{array}\right]=\left[\begin{array}{cc} a & b\\ 0 & b\end{array}\right] $$

Changing the order, I get $$ \left[\begin{array}{cc} a & 0\\ 0 & b\end{array}\right]\left[\begin{array}{cc} 1 & 1\\ 0 & 1\end{array}\right]=\left[\begin{array}{cc} a & a\\ 0 & b\end{array}\right] $$ which is different when $a\neq b$.

Hope that helps. (Sometimes change of basis matrices can go on different sides for different reasons, but without seeing the exact text you are talking about I can't comment)
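
For completeness, a quick numerical check of the example above (a NumPy sketch with $a = 2$, $b = 3$; any distinct nonzero values work):

```python
import numpy as np

a, b = 2.0, 3.0                      # any a != b exhibits the failure
U = np.array([[1.0, 1.0],
              [0.0, 1.0]])
D = np.diag([a, b])                  # diagonal, and invertible since a, b != 0

print(U @ D)                         # [[2. 3.], [0. 3.]]
print(D @ U)                         # [[2. 2.], [0. 3.]]
print(np.allclose(U @ D, D @ U))     # False
```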

  • 0
Sorry, maybe I made a mistake. A scalar multiple of the identity matrix commutes with all other matrices. But I forgot the proof.2011-02-11
  • 2
That is of course true since constants commute, and the identity can be put wherever we want. But a diagonal matrix and a constant multiple of the identity are very different. (I have definitely made that commuting mistake before too; it seems quite reasonable, actually...)2011-02-11
  • 0
    Diagonal matrices are very good examples. A diagonal matrix with distinct diagonal entries commutes *only* with other diagonal matrices.2011-02-11
9

In general, two matrices (invertible or not) do not commute. For example $$\left(\begin{array}{cc} 1 & 1\\ 0 & 1\end{array}\right)\left(\begin{array}{cc} 1 & 0\\ 1 & 1\end{array}\right) = \left(\begin{array}{cc} 2 & 1\\ 1 & 1\end{array}\right) $$ $$ \left(\begin{array}{cc} 1 & 0\\ 1 & 1\end{array}\right)\left(\begin{array}{cc} 1 & 1\\ 0 & 1\end{array}\right) = \left(\begin{array}{cc} 1 & 1\\ 1 & 2\end{array}\right)$$

Also, to change a basis you usually need to conjugate, as in $S^{-1}AS$, and not just multiply from the left (or just the right).
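
Here is a sketch of what that conjugation does (the map $A$ and the basis matrix $S$ below are made-up examples, assuming the columns of $S$ are the new basis vectors written in the old coordinates):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])           # some linear map, in the old basis
S = np.array([[1.0, 1.0],
              [1.0, 2.0]])           # invertible change-of-basis matrix

B = np.linalg.inv(S) @ A @ S         # the same map expressed in the new basis

x = np.array([1.0, -1.0])            # coordinates of a vector in the NEW basis
image_old = A @ (S @ x)              # apply the map using old coordinates
image_new = S @ (B @ x)              # apply B in new coordinates, convert back

print(np.allclose(image_old, image_new))   # True: B really is the same map
```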

What you do know is that a matrix $A$ commutes with $A^n$ for all $n$ (negative $n$ too if $A$ is invertible, with $A^0 = I$), so for every polynomial $P$ (or Laurent polynomial if $A$ is invertible) $A$ commutes with $P(A)$.
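
A quick check of that fact (NumPy sketch; the matrix $A$ and the polynomial $P(t) = t^2 - 4t + 2$ are arbitrary choices):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
I = np.eye(2)

P_of_A = A @ A - 4.0 * A + 2.0 * I            # P(A) for P(t) = t^2 - 4t + 2
print(np.allclose(A @ P_of_A, P_of_A @ A))    # True

# If A is invertible, the same holds for negative powers (Laurent polynomials):
A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, A_inv @ A))      # True: both equal the identity
```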

  • 0
This is a good answer, +1. Not that it is much different from mine, but I don't know why some answers never get voted on.2011-02-11