Short answer: As it stands, the sentence seems quite wrong (though I have not looked up the context in the book).
Long answer:
1) The eigenvalues of a matrix are generally irrational numbers, so it is generally not possible to write them down exactly; you can only ever compute an approximation. Positive-definiteness does not change that fact.
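To illustrate: here is a small sketch (my own example matrix, not from the book) of a symmetric positive-definite matrix whose eigenvalues $(3 \pm \sqrt{5})/2$ are irrational, so any floating-point eigensolver can only return an approximation:

```python
import numpy as np

# Symmetric positive-definite matrix (det = 1 > 0, trace = 3 > 0) whose
# eigenvalues (3 +- sqrt(5)) / 2 are irrational: a solver can only
# return floating-point approximations of them.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

approx = np.linalg.eigvalsh(A)  # ascending order
exact = np.array([(3 - np.sqrt(5)) / 2, (3 + np.sqrt(5)) / 2])
```

The computed values agree with the closed form to machine precision, but no finite decimal expansion is ever exact.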
2) Finding the eigenvalues of a (dense) $n\times n$ matrix to a precision of $k$ digits using the common procedure of a Hessenberg decomposition followed by the (shifted) QR algorithm (a.k.a. the "Francis method") will typically take about $O(n^3\log(k))$ time (though the details certainly depend on condition numbers and the like). For Hermitian/symmetric matrices you can reduce this to $O(n^3 + n^2\log(k))$, and numerical stability is typically better in the Hermitian/symmetric case as well.
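The Hessenberg-then-shifted-QR procedure can be sketched in a few lines. This is a toy explicit version (the function name `eigvals_qr` and all details are mine; production codes use implicit shifts, bulge chasing, and smarter deflation), restricted for simplicity to real symmetric matrices:

```python
import numpy as np
from scipy.linalg import hessenberg

def eigvals_qr(A, tol=1e-12):
    """Toy shifted QR iteration: Hessenberg reduction once, then repeated
    shifted QR steps with deflation.  A sketch of the Francis-method idea,
    not the implicit production version.  Assumes A is real symmetric,
    so the Hessenberg form is tridiagonal and all eigenvalues are real."""
    H = hessenberg(np.asarray(A, dtype=float))  # one-time O(n^3) reduction
    n = H.shape[0]
    eigs, it = [], 0
    while n > 1:
        it += 1
        if it > 1000 * H.shape[0]:
            raise RuntimeError("QR iteration failed to converge")
        # deflate once the last subdiagonal entry is negligible
        if abs(H[n-1, n-2]) <= tol * max(1.0, abs(H[n-2, n-2]) + abs(H[n-1, n-1])):
            eigs.append(H[n-1, n-1])
            n -= 1
            continue
        # Wilkinson shift computed from the trailing 2x2 block
        a, b, c = H[n-2, n-2], H[n-1, n-2], H[n-1, n-1]
        d = (a - c) / 2.0
        s = 1.0 if d >= 0 else -1.0
        mu = c - s * b * b / (abs(d) + np.hypot(d, b))
        # one explicit shifted QR step: H <- RQ + mu*I is similar to H
        Q, R = np.linalg.qr(H[:n, :n] - mu * np.eye(n))
        H[:n, :n] = R @ Q + mu * np.eye(n)
    eigs.append(H[0, 0])
    return np.sort(np.array(eigs))
```

On a random symmetric matrix this agrees with `np.linalg.eigvalsh` to high precision; the deflation step is where the per-eigenvalue convergence shows up.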
3) As you point out, the QR algorithm doesn't have anything to do with Gaussian elimination (and honestly, the version used in practice doesn't really look like a QR decomposition either). But there is an older LR algorithm based on the LU decomposition instead of the QR decomposition. It is not used today because of its inferior numerical properties, but one could in principle argue that an eigenvalue decomposition based on Gaussian elimination is possible.
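That older LU-based iteration is easy to sketch: factor $A_k = L_k U_k$ and set $A_{k+1} = U_k L_k$, which is similar to $A_k$ since $U_k L_k = L_k^{-1} A_k L_k$. The sketch below (function names mine; convergence assumed, e.g. for a symmetric positive-definite matrix with distinct eigenvalues) also shows why the method is fragile: it needs LU *without pivoting* at every step.

```python
import numpy as np

def lu_nopivot(A):
    """Doolittle LU without pivoting -- required by the LR iteration,
    and exactly why it is numerically fragile."""
    n = A.shape[0]
    L, U = np.eye(n), A.astype(float).copy()
    for k in range(n - 1):
        L[k+1:, k] = U[k+1:, k] / U[k, k]          # fails if pivot is ~0
        U[k+1:, k:] -= np.outer(L[k+1:, k], U[k, k:])
    return L, U

def eigvals_lr(A, iters=200):
    """LR iteration: A_{k+1} = U_k @ L_k is similar to A_k and, under
    suitable assumptions, converges to an upper triangular matrix with
    the eigenvalues on its diagonal."""
    H = np.asarray(A, dtype=float).copy()
    for _ in range(iters):
        L, U = lu_nopivot(H)
        H = U @ L
    return np.sort(np.diag(H))
```

For a well-behaved test matrix the diagonal converges linearly, at a rate governed by the ratios of consecutive eigenvalue magnitudes.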
4) If you already know an eigenvalue, you can find its eigenvector using Gaussian elimination. But that's not the hard part of the problem.
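Concretely: given a known eigenvalue $\lambda$, an eigenvector is any nonzero solution of the singular system $(A - \lambda I)v = 0$, which Gaussian elimination finds directly. A small sketch (function name and tolerance choices mine):

```python
import numpy as np

def eigvec_from_eigval(A, lam, tol=1e-10):
    """Given a known eigenvalue lam, find an eigenvector by Gauss-Jordan
    elimination on the singular system (A - lam*I) v = 0."""
    M = A - lam * np.eye(A.shape[0])
    n = M.shape[0]
    row, pivots = 0, []
    for col in range(n):
        p = row + np.argmax(np.abs(M[row:, col]))   # partial pivoting
        if np.abs(M[p, col]) < tol:
            continue                                 # free column
        M[[row, p]] = M[[p, row]]
        M[row] = M[row] / M[row, col]
        for r in range(n):                           # eliminate the column
            if r != row:
                M[r] -= M[r, col] * M[row]
        pivots.append(col)
        row += 1
    # M is singular, so at least one free column exists
    free = [c for c in range(n) if c not in pivots][0]
    v = np.zeros(n)
    v[free] = 1.0                                    # set the free variable
    for r, c in enumerate(pivots):
        v[c] = -M[r, free]                           # back-substitute
    return v / np.linalg.norm(v)
```

The work here is a single $O(n^3)$ elimination; the genuinely hard part, finding $\lambda$ in the first place, is already done.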
5) For the sake of completeness: if you consider matrices over finite fields instead of the real/complex numbers, then there is indeed a polynomial-time algorithm that finds all eigenvalues exactly. (That is most certainly not what you or the book are talking about.)
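Over $\mathbb{F}_p$ everything is exact integer arithmetic, so the eigenvalues can be found with no approximation at all. The brute-force sketch below (functions mine; it just tests every $\lambda \in \mathbb{F}_p$, whereas the actual polynomial-time approach factors the characteristic polynomial, e.g. with Berlekamp's algorithm) checks $\det(A - \lambda I) \equiv 0 \pmod p$:

```python
def det_mod_p(M, p):
    """Determinant over GF(p) via Gaussian elimination -- exact integer
    arithmetic, with inverses from Fermat's little theorem (p prime)."""
    M = [row[:] for row in M]
    n, det = len(M), 1
    for col in range(n):
        piv = next((r for r in range(col, n) if M[r][col] % p), None)
        if piv is None:
            return 0                      # singular over GF(p)
        if piv != col:
            M[col], M[piv] = M[piv], M[col]
            det = -det                    # row swap flips the sign
        det = det * M[col][col] % p
        inv = pow(M[col][col], p - 2, p)  # modular inverse of the pivot
        for r in range(col + 1, n):
            f = M[r][col] * inv % p
            M[r] = [(a - f * b) % p for a, b in zip(M[r], M[col])]
    return det % p

def eigvals_gf(A, p):
    """All eigenvalues of A in GF(p): every lam with det(A - lam*I) = 0."""
    n = len(A)
    return [lam for lam in range(p)
            if det_mod_p([[(A[i][j] - (lam if i == j else 0)) % p
                           for j in range(n)] for i in range(n)], p) == 0]
```

Note that a matrix may have *no* eigenvalues in the base field at all, just as a real matrix may have only complex eigenvalues; the characteristic polynomial then simply has no roots in $\mathbb{F}_p$.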