
Page 401 of the Nature of Computation flatly states:

Using Gaussian elimination, we can diagonalize $\textbf{X}$ and thus find its eigenvalues and eigenvectors in polynomial time.

$\textbf{X}$ is assumed to be positive definite at an earlier point in the text.
I'm extremely skeptical of this claim. The standard algorithm for diagonalization that I'm familiar with doesn't use Gaussian elimination in a central way and I've heard over and over that finding eigenvalues exactly is basically impossible and one therefore has to rely on numerical approximations. Is the positive definite case somehow an exception?

1 Answer


Short answer: As it stands, the sentence seems quite wrong (though I have not looked up the context in the book).

Long answer:

1) The eigenvalues of a matrix are generally irrational numbers, so it is generally not possible to write them down exactly; you can only ever approximate them. Positive-definiteness does not change that fact.
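To make (1) concrete, here is a tiny sketch (the matrix is an illustrative choice): even a $2\times 2$ symmetric positive-definite matrix with integer entries generally has irrational eigenvalues.

```python
import numpy as np

# Illustrative 2x2 symmetric positive-definite matrix
# (leading principal minors are 2 and 1, both positive).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

# Characteristic polynomial: lambda^2 - 3*lambda + 1 = 0,
# so the exact eigenvalues are (3 +/- sqrt(5))/2 -- both irrational.
vals = np.linalg.eigvalsh(A)  # floating-point approximations only
print(np.sort(vals))
```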

2) Finding the eigenvalues of a (dense) $n\times n$ matrix to a precision of $k$ digits using the common procedure of a Hessenberg decomposition followed by the (shifted) QR algorithm (aka the "Francis method") typically takes about $O(n^3\log(k))$ time (though the details certainly depend on condition numbers and the like). For Hermitian/symmetric matrices you can reduce this to $O(n^3 + \log(k)n^2)$, and numerical stability is typically better in the Hermitian/symmetric case as well.
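A minimal sketch of the procedure in (2): Hessenberg reduction followed by QR iteration. Production code uses shifts and deflation for fast convergence; the unshifted variant below, the test matrix, and the iteration count are all simplifying choices for illustration.

```python
import numpy as np
import scipy.linalg

# Symmetric positive-definite test matrix with known eigenvalues 1, 2, 4, 8.
rng = np.random.default_rng(0)
Q0, _ = np.linalg.qr(rng.standard_normal((4, 4)))
A = Q0 @ np.diag([1.0, 2.0, 4.0, 8.0]) @ Q0.T

# Step 1: Hessenberg reduction (tridiagonal here, since A is symmetric).
H = scipy.linalg.hessenberg(A)

# Step 2: QR iteration -- each H_{k+1} = R_k Q_k is similar to H_k,
# so the eigenvalues are preserved while H converges toward diagonal form.
for _ in range(100):
    Q, R = np.linalg.qr(H)
    H = R @ Q

print(np.sort(np.diag(H)))  # approaches [1, 2, 4, 8]
```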

3) As you point out, the QR algorithm doesn't have anything to do with Gaussian elimination (and, honestly, the version used in practice doesn't really look like a QR decomposition either). But there is an older LR algorithm based on the LU decomposition instead of the QR decomposition. It is not used today because of its inferior numerical properties, but one could in principle argue that an eigenvalue decomposition based on Gaussian elimination is possible.
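To illustrate (3): for a symmetric positive-definite matrix, the LU/LR idea can be run in its Cholesky form (Cholesky factorization being Gaussian elimination specialized to this case). A sketch, with an arbitrary test matrix:

```python
import numpy as np

# SPD test matrix with known eigenvalues 1, 2, 4 (illustrative choice).
rng = np.random.default_rng(1)
Q0, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = Q0 @ np.diag([1.0, 2.0, 4.0]) @ Q0.T

# Cholesky-LR iteration: H = G G^T  ->  H' = G^T G = G^{-1} H G.
# Each step is a similarity transform, so the eigenvalues are preserved
# while the iterates converge toward a diagonal matrix.
H = A.copy()
for _ in range(100):
    G = np.linalg.cholesky(H)  # lower triangular, H = G @ G.T
    H = G.T @ G

print(np.sort(np.diag(H)))  # converges to the eigenvalues 1, 2, 4
```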

4) If you already know an eigenvalue, you can find its eigenvector using Gaussian elimination. But that's not the hard part of the problem.
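Point (4) as a sketch in exact arithmetic. SymPy's `nullspace()` performs Gaussian elimination (row reduction); the matrix and eigenvalue here are illustrative.

```python
from sympy import Matrix, eye

A = Matrix([[2, 1],
            [1, 2]])
lam = 3  # a known eigenvalue of A (the other is 1)

# Gaussian elimination on (A - lam*I) yields its null space,
# which consists exactly of the eigenvectors for lam.
v = (A - lam * eye(2)).nullspace()[0]
print(v.T)               # Matrix([[1, 1]])
print(A * v == lam * v)  # True
```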

5) For the sake of completeness: if you consider matrices over finite fields instead of the real/complex numbers, then there is indeed a polynomial-time algorithm that finds all eigenvalues exactly. (That is most certainly not what you or the book are talking about.)
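A toy illustration of (5) over $GF(5)$. The brute force over all field elements is only meant to show that exact answers exist here; genuinely polynomial-time algorithms instead factor the characteristic polynomial over the field (e.g. with Berlekamp's algorithm). All names below are illustrative.

```python
p = 5  # work over the finite field GF(5)
A = [[0, 1],
     [1, 1]]  # characteristic polynomial: x^2 - x - 1

def det2_mod(M, p):
    """Determinant of a 2x2 matrix, reduced mod p."""
    return (M[0][0] * M[1][1] - M[0][1] * M[1][0]) % p

# An eigenvalue is any field element x with det(A - x*I) == 0 in GF(p);
# here x = 3 works, since 3^2 - 3 - 1 = 5 == 0 (mod 5).
eigs = [x for x in range(p)
        if det2_mod([[A[0][0] - x, A[0][1]],
                     [A[1][0], A[1][1] - x]], p) == 0]
print(eigs)  # [3]
```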

  • Regarding 5): the text actually bypasses the infinity of $\mathbb{R}$ at another point by insisting that the solution only needs to be approximated to within $2^{-O(n)}$. Perhaps that's a tacit assumption here too? (2017-02-22)
  • That has nothing to do with (5). Finite fields are something completely different from the real/complex/rational numbers (don't worry if you've never heard of them; they are rarely used outside of mathematics). But: being approximated to within $2^{-O(n)}$ means $O(n)$ digits, so, as I explained in (2), the numerical algorithms used in practice do run in polynomial time (typically around $O(n^3)$). The results are usually very good, but the algorithm is quite a bit more complicated than Gaussian elimination. (2017-02-22)