So I know that you can use the Strassen algorithm to multiply two $2 \times 2$ (block) matrices with seven multiplications instead of eight, but what about multiplying a matrix by itself? Is there a faster way to do that (i.e., by reducing the number of multiplications per step to fewer than seven)?
What's the fastest way to take powers of a square matrix?
-
The matrix has only real entries. – 2011-02-07
2 Answers
Consider $M = \begin{pmatrix} 0 & A & 0 \\ 0 & 0 & B \\ 0 & 0 & 0 \end{pmatrix}.$ We have $M^2 = \begin{pmatrix} 0 & 0 & AB \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}.$ So any squaring algorithm yields a general multiplication algorithm of the same asymptotic cost: matrix squaring is not asymptotically easier than a general matrix product. Of course, in practice the constant-factor difference is very significant.
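The reduction above can be sketched directly, assuming numpy is available and that $A$ and $B$ are square matrices of the same size:

```python
import numpy as np

def multiply_via_squaring(A, B):
    """Compute AB using only a squaring routine, via the block matrix
    M = [[0, A, 0], [0, 0, B], [0, 0, 0]]; AB appears in the top-right block of M^2."""
    n = A.shape[0]
    Z = np.zeros((n, n))
    M = np.block([[Z, A, Z],
                  [Z, Z, B],
                  [Z, Z, Z]])
    M2 = M @ M  # stand-in for any matrix-squaring algorithm
    return M2[:n, 2 * n:]  # extract the top-right n-by-n block, which equals AB

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
print(np.allclose(multiply_via_squaring(A, B), A @ B))  # prints True
```

Note $M$ is three times the size of $A$, which is where the constant-factor overhead of this reduction comes from.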
Taking a power is best achieved by repeated squaring. The other method is diagonalization, but I think eigenvalues are usually found via the power method rather than vice versa.
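A minimal sketch of repeated squaring (assuming numpy; the helper name is mine):

```python
import numpy as np

def matrix_power(A, k):
    """Compute A^k for integer k >= 0 using O(log k) matrix multiplications."""
    result = np.eye(A.shape[0])
    base = A.copy()
    while k > 0:
        if k & 1:          # current low bit of k is set:
            result = result @ base
        base = base @ base  # square to advance to the next bit
        k >>= 1
    return result

A = np.array([[1.0, 1.0], [1.0, 0.0]])  # Fibonacci matrix
print(matrix_power(A, 10))  # entries are Fibonacci numbers: [[89, 55], [55, 34]]
```

Each iteration consumes one bit of $k$, so $A^k$ costs at most $2\lfloor\log_2 k\rfloor + 1$ multiplications.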
-
You can also find this argument in CLRS (I think). – 2011-02-08
It really depends on the matrix. For instance, if you have a diagonalizable matrix, then you can decrease the number of multiplications by diagonalizing, so that
$A^k=P^{-1}B^k P,$
and since $B$ is a diagonal matrix, computing $B^k$ only requires raising its diagonal entries to the $k$th power, leaving just $2$ matrix multiplications. Somewhat similar things can be done with Jordan normal form matrices. A general method is binary exponentiation, which reduces the number of matrix multiplications needed from $N$ to $O(\log N)$. Technically speaking, matrix multiplication can be done "faster" than Strassen as well, but this only pays off for very large matrices, due to the large constant factor hidden in the Coppersmith–Winograd algorithm.
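The diagonalization route can be sketched as follows (assuming numpy; note `np.linalg.eig` returns the factorization in the convention $A = P B P^{-1}$, so the two full matiplications are by $P$ and $P^{-1}$):

```python
import numpy as np

def power_by_diagonalization(A, k):
    """Compute A^k for a diagonalizable matrix A via A = P B P^{-1},
    so A^k = P B^k P^{-1} with B^k an entrywise power of the diagonal."""
    eigvals, P = np.linalg.eig(A)      # A @ P == P @ diag(eigvals)
    Bk = np.diag(eigvals ** k)         # elementwise powers: no matrix multiplication
    return P @ Bk @ np.linalg.inv(P)   # the only two full matrix products

A = np.array([[2.0, 1.0], [1.0, 2.0]])  # symmetric, hence diagonalizable
print(np.allclose(power_by_diagonalization(A, 5), np.linalg.matrix_power(A, 5)))  # prints True
```

In floating point this trades multiplications for an eigendecomposition and an inverse, so it only wins when many different powers of the same well-conditioned matrix are needed.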