
My idea for proving that every real symmetric matrix can be diagonalized was to first prove that two eigenvectors with different eigenvalues must be orthogonal, but then I failed to prove that the eigenvectors span the whole vector space.

To be specific, my question is: if $A$ is a real symmetric $n\times n$ matrix, let $p(t)=\det(tI-A)$ be the characteristic polynomial of $A$, and let $\lambda$ be an eigenvalue of $A$ that is a root of $p(t)$ of order $k$. How does one prove that $\dim (\ker(\lambda I-A))=k$?
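For concreteness, the claim can be checked numerically on a small example. This is a sketch of my own (the matrix is a hypothetical illustration, not from the question), using NumPy's rank computation to get the kernel dimension:

```python
import numpy as np

# Illustrative symmetric matrix with characteristic polynomial
# (t - 4)(t - 1)^2, so lambda = 1 has algebraic multiplicity k = 2.
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])
lam = 1.0

# dim ker(lambda*I - A) = n - rank(lambda*I - A)  (rank-nullity)
n = A.shape[0]
geometric = n - np.linalg.matrix_rank(lam * np.eye(n) - A)
print(geometric)  # 2, matching the algebraic multiplicity k
```

Here $\lambda I-A$ is the all-$(-1)$ matrix of rank 1, so the nullity is $3-1=2$, exactly the order of the root.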

  • 0
    Do you already know a proof of the theorem and want to find your own using this particular characterisation of diagonalizability, or are you asking for any proof of this fact? I ask because it seems to me a little unsuited to go about proving the theorem using this criterion of diagonalizability, but comments and answers may prove me wrong. 2012-05-03
  • 0
    @OlivierBégassat Yes, I do know a proof, but the proof in my textbook seems to deliberately avoid the question I put here, so I'm asking for help~ 2012-05-03
  • 0
    What is the gist of your textbook proof? One way to prove it is to use the fact that any real endomorphism stabilizes a line or a plane (possibly both), to show that the induced endomorphism remains symmetric and that the orthogonal complement is also stabilized, and to use recursion. There are slightly easier proofs along the same lines... 2012-05-03
  • 0
    Do you know what the relation is between the multiplicity of $\lambda$ and $\dim (\ker(\lambda I-A))$? 2012-05-03
  • 0
    On another note: you are aware that it is possible for a symmetric matrix to have repeated eigenvalues, and yet still have orthonormal eigenvectors, right?2012-05-03
  • 0
    Here is one possible way to go about it. First show that there are only real eigenvalues; then you know $A$ is triangularizable. Then use the Gram-Schmidt process to orthonormalize the triangularizing basis. Because of how this process works, the orthonormal basis you end up with is still a triangularizing basis. But in an orthonormal basis the matrix of $A$ remains symmetric. As a result, your matrix is both upper triangular and symmetric, and thus diagonal. At first I wanted to look at symmetric nilpotent matrices, but this is quicker, albeit a little further away from your request. 2012-05-03
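The point raised in the comments, that a repeated eigenvalue does not prevent a symmetric matrix from having an orthonormal eigenbasis, can be checked numerically. A sketch of my own (the example matrix is illustrative, not from the discussion):

```python
import numpy as np

# A symmetric matrix with eigenvalues 4, 1, 1 (a repeated eigenvalue).
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

# eigh is NumPy's routine for symmetric matrices; the columns of Q
# are orthonormal eigenvectors of A.
vals, Q = np.linalg.eigh(A)

print(np.allclose(Q.T @ Q, np.eye(3)))          # True: Q is orthogonal
print(np.allclose(Q @ np.diag(vals) @ Q.T, A))  # True: Q diagonalizes A
```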

1 Answer


There are obviously many ways to prove your statement. Some of the comments suggest using the following:

Let $v$ be an eigenvector for the eigenvalue $\lambda$. Set $U=\mathbb R v$ and write $\mathbb R^n=U\oplus U^\bot$. Then $A(U)\subseteq U$ and $A(U^\bot)\subseteq U^\bot$ (symmetry is used here). Therefore we can restrict $A$ to $U^\bot$, obtain again a symmetric matrix, and proceed by induction. This is essentially the standard proof of the spectral theorem.
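The key invariance claim $A(U^\bot)\subseteq U^\bot$ can be sanity-checked numerically: for symmetric $A$, if $\langle w,v\rangle=0$ then $\langle Aw,v\rangle=\langle w,Av\rangle=\lambda\langle w,v\rangle=0$. A sketch of my own (the random matrix is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B + B.T                      # a random symmetric matrix

vals, vecs = np.linalg.eigh(A)
v = vecs[:, 0]                   # a (unit) eigenvector of A

w = rng.standard_normal(4)
w = w - (w @ v) * v              # project w onto v^perp, so <w, v> = 0

# <Aw, v> = <w, Av> = lambda * <w, v> = 0: A maps v^perp into v^perp
print(abs((A @ w) @ v) < 1e-10)  # True
```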

Another way, I think one closer to what you asked, would be via the following lemma: if $x\in \ker((\lambda I-A)^k)$, then $x\in \ker(\lambda I-A)$. For simplicity I will prove the case $k=2$. Let $x\in \ker((\lambda I-A)^2)$ and $y=(\lambda I-A)x$. We want to show $y=0$. We have $$\lambda x=Ax+y$$ and $$\lambda y=Ay.$$ It follows that $$\lambda \langle x,y\rangle=\langle Ax+y,y\rangle=\langle Ax,y\rangle+\langle y,y\rangle=\langle x,Ay\rangle+\langle y,y\rangle=\lambda\langle x,y\rangle+\langle y,y\rangle,$$ where the symmetry of $A$ gives $\langle Ax,y\rangle=\langle x,Ay\rangle$. Subtracting $\lambda\langle x,y\rangle$ from both sides yields $\langle y,y\rangle=0$, which means $y=0$.
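The lemma for $k=2$ says $\ker((\lambda I-A)^2)=\ker(\lambda I-A)$ when $A$ is symmetric, which is equivalent to the two matrices having equal rank. A quick numerical check of my own (the matrix is a hypothetical example with a repeated eigenvalue):

```python
import numpy as np

# Symmetric matrix where lambda = 1 is a repeated eigenvalue.
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])
lam = 1.0
B = lam * np.eye(3) - A

# Equal ranks of B and B^2 <=> equal kernels (the lemma for k = 2).
print(np.linalg.matrix_rank(B) == np.linalg.matrix_rank(B @ B))  # True
```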

If you believe that there is a basis of generalised eigenvectors (this holds over $\mathbb C$ for any matrix, and the eigenvalues of a real symmetric matrix are all real), you are done.