
If $A$ is a general invertible square matrix with $n$ rows, the computational complexity of inverting $A$ is at least $O\left(n^2\ln n\right)$ and at most $O\left(n^k\right),\,k\approx 2.373$. Some problems, such as linear least squares, require inverting a matrix of the form $X^TX$, which is symmetric (and positive semidefinite). Can this special case be handled more efficiently?

2 Answers


In practice, one inverts arbitrary matrices using (a potentially fancy version of) $LU$ decomposition, whereas matrices of the form $X^TX$ (symmetric positive definite when $X$ has full column rank) can be inverted via Cholesky decomposition. As the wiki article for the Cholesky decomposition states:

when it is applicable, the Cholesky decomposition is roughly twice as efficient as the LU decomposition for solving systems of linear equations. (Press, William H.; Saul A. Teukolsky; William T. Vetterling; Brian P. Flannery (1992). Numerical Recipes in C: The Art of Scientific Computing (second edition). Cambridge, England: Cambridge University Press. p. 994.)

I assume that what was true 20 years ago is still true now.
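A minimal sketch of the two routes, assuming SciPy's standard factorization helpers (`lu_factor`/`lu_solve` for the generic case, `cho_factor`/`cho_solve` for the symmetric positive-definite case):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve, cho_factor, cho_solve

# Build a symmetric positive-definite system A x = b with A = X^T X.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 100))   # full column rank almost surely
A = X.T @ X
b = rng.standard_normal(100)

# Generic route: LU decomposition with partial pivoting.
x_lu = lu_solve(lu_factor(A), b)

# Symmetric positive-definite route: Cholesky, roughly half the flops.
x_chol = cho_solve(cho_factor(A), b)

print(np.allclose(x_lu, x_chol))  # both routes solve the same system
```

Note that in practice one rarely forms $A^{-1}$ explicitly; solving the factored system directly, as above, is cheaper and more numerically stable.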


Hint: The calculation of the inverse of a matrix $M$ involves the calculation of the determinants of the minors $M_{i,j}$ (the minor obtained by removing the $i$-th row and the $j$-th column). For a symmetric matrix we have $\det(M_{i,j})=\det(M_{j,i})$, so instead of $n^2$ determinants of minors only $n(n+1)/2$ have to be calculated. Moreover, the diagonal minors are themselves symmetric.
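A quick numerical check (a sketch, using NumPy and a hypothetical `minor` helper) of the symmetry claim $\det(M_{i,j})=\det(M_{j,i})$:

```python
import numpy as np

rng = np.random.default_rng(1)
S = rng.standard_normal((5, 5))
M = S + S.T  # a symmetric test matrix

def minor(M, i, j):
    """Determinant of M with row i and column j removed."""
    sub = np.delete(np.delete(M, i, axis=0), j, axis=1)
    return np.linalg.det(sub)

# Deleting row i / column j of a symmetric M is the transpose of
# deleting row j / column i, and det is transpose-invariant.
ok = all(np.isclose(minor(M, i, j), minor(M, j, i))
         for i in range(5) for j in range(5))
print(ok)  # only n(n+1)/2 distinct minors need computing
```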

  • It sounds like we can only use that to reduce the complexity by an $O\left(1\right)$ factor. – 2017-02-21