
I have a math problem I am struggling with:

If $A: \mathbb{R}^n\to\mathbb{R}^n$ is a linear transformation and we have a basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$, can't we just orthonormalize them to get a matrix $P$ such that $P^{-1}=P^\mathrm{T}$, so that $P^\mathrm{T}AP$ is diagonal?

Solution (so far):

Let $B$ be the matrix of eigenvectors of $A$.

Now, I know that if $B$ is orthogonalizable, then the rest of the problem is true, and I also know that $B$ is diagonalizable, because its columns form a basis of $\mathbb{R}^n$ and are therefore linearly independent. I also know (from prodding the professor) that the conjecture is false (i.e. we can't just orthonormalize $B$). I can't see how the process of normalizing the vectors in a matrix would cause a problem, so it must come down to whether the [linearly independent] vectors of $B$ are orthogonalizable.

Question:

Which matrices of linearly independent vectors are not orthogonalizable, and why? Thanks

  • 1
    http://en.wikipedia.org/wiki/Gram%E2%80%93Schmidt_process (2012-11-09)
  • 1
    A piece of good advice is to try LaTeX code rather than -->. ^^ (2012-11-09)
  • 0
    What do you mean by "orthogonalizable"? (2012-11-09)
  • 0
    If you have a basis, you can obtain an orthonormal basis by a process called the Gram–Schmidt process. Check it out on Wikipedia; you'll see the details. (2012-11-09)
  • 0
    I know about Gram–Schmidt. But that doesn't tell you when a matrix isn't orthogonalizable, except when a norm is zero, which it can't be. (2012-11-09)
  • 0
    By "orthogonalizable", I mean making the vectors orthogonal. (2012-11-09)

2 Answers


The thing is, the eigenspaces are sometimes fixed in direction. If you have two single dimensional eigenspaces which are not orthogonal to begin with, how are you going to orthogonalize them?

Orthogonally diagonalizable matrices take a very specific form. Suppose that $A$ is orthogonally diagonalizable. Then there is an orthogonal matrix $P$ and a diagonal matrix $D$ such that $$A = PDP^{\mathrm{T}}$$ But then, since $D^{\mathrm{T}} = D$, this implies $$A^{\mathrm{T}} = (PDP^{\mathrm{T}})^{\mathrm{T}} = PD^{\mathrm{T}}P^{\mathrm{T}} = PDP^{\mathrm{T}} = A$$ So $A$ is symmetric. Conversely, from the principal axis theorem, it follows that a real matrix is orthogonally diagonalizable if and only if it is symmetric.
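The failure can be seen numerically. Below is a minimal sketch in plain Python (the matrix, its eigenvectors, and the helper functions are illustrative choices, not from the thread): we take a non-symmetric diagonalizable matrix, run Gram–Schmidt on its eigenvectors, and observe that $P^{\mathrm{T}}AP$ is no longer diagonal.

```python
# Sketch: orthonormalizing the eigenvectors of a NON-symmetric
# diagonalizable matrix does not diagonalize it.

def matmul(X, Y):
    """Naive matrix product of two lists-of-rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

A = [[2.0, 1.0],
     [0.0, 1.0]]            # eigenvalues 2 and 1; A is not symmetric

# Eigenvectors: (1, 0) for eigenvalue 2 and (1, -1) for eigenvalue 1.
# Note they are NOT orthogonal.
v1, v2 = [1.0, 0.0], [1.0, -1.0]

# Gram-Schmidt on (v1, v2):
e1 = v1                                        # already unit length
proj = sum(a * b for a, b in zip(v2, e1))      # projection coefficient
u2 = [v2[i] - proj * e1[i] for i in range(2)]  # subtract the e1 component
norm = sum(x * x for x in u2) ** 0.5
e2 = [x / norm for x in u2]                    # (0, -1)

P = transpose([e1, e2])     # columns are the orthonormal vectors
D = matmul(transpose(P), matmul(A, P))
print(D)                    # off-diagonal entry is nonzero, so not diagonal
```

The second Gram–Schmidt vector $(0,-1)$ is no longer an eigenvector of $A$, which is exactly why the conjugation fails to produce a diagonal matrix.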

  • 0
    So are you implying that the basis of eigenvectors of a matrix which is not symmetric cannot be orthogonalized, i.e. put through the Gram–Schmidt process? (2012-11-09)
  • 0
    Just to clarify, the basis of eigenvectors needs to be symmetric, not the matrix itself. (2012-11-09)
  • 0
    I also don't see how you equated $A^{\mathrm{T}}$ to $PDP^{\mathrm{T}}$. (2012-11-09)
  • 0
    If you want a basis of _orthonormal_ eigenvectors then $A$ is necessarily diagonal. As for $A^{\mathrm{T}}$, I suggest you study how the transpose operator acts on products of matrices. (2012-11-09)
  • 0
    Sorry, replace "$A$ is necessarily _diagonal_" with "$A$ is necessarily _symmetric_" in the above comment. (2012-11-09)
  • 0
    @EuYu, do you think the more general statement is that $A$ should be normal to be orthogonally diagonalizable, which covers the symmetric and skew-symmetric cases? (2012-11-10)
  • 1
    @dineshdileep Over the complex numbers, normality is indeed the condition. Over the real numbers, orthogonal diagonalizability is equivalent to being symmetric, so I didn't bring it up. (2012-11-10)
  • 0
    @EuYu So how about skew-symmetric matrices, aren't they diagonalizable? (2012-11-10)
  • 1
    @dineshdileep All eigenvalues of a nonzero skew-symmetric matrix are imaginary. While they are diagonalizable over the complex numbers, they are not diagonalizable over the reals. (2012-11-10)
  • 0
    @EuYu Yes, I understood!! (2012-11-10)

For a general diagonalisable matrix $A$, if you take a basis of eigenvectors and orthonormalise it, what you get is an orthonormal basis of (generally) non-eigenvectors, except for the first one, which has kept its direction. The Gram–Schmidt process knows nothing about $A$, and linear combinations of eigenvectors (for different eigenvalues) are not eigenvectors; if they were, then for a diagonalisable matrix every vector would be an eigenvector, which is evidently false.

It is also easy to see that you cannot hope to find an orthonormal basis of eigenvectors unless all eigenspaces are orthogonal to each other (each eigenspace is spanned by a subset of the orthonormal basis of eigenvectors, and for any two eigenspaces, all the spanning vectors for one are orthogonal to those for the other). As EuYu explains, this condition of having orthogonal eigenspaces (in your real-vector-space setting) is satisfied if and only if the matrix is symmetric.
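The claim that sums of eigenvectors for different eigenvalues are not eigenvectors can be checked directly. A minimal sketch in plain Python (the matrix, vectors, and `is_eigenvector` helper are illustrative choices, not from the answer):

```python
# Sketch: a sum of eigenvectors belonging to *different* eigenvalues
# is not itself an eigenvector.

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def is_eigenvector(M, v, tol=1e-9):
    """True iff M v is parallel to the nonzero 2D vector v."""
    w = matvec(M, v)
    # 2D cross-product test: w is parallel to v iff w[0]*v[1] - w[1]*v[0] == 0
    return abs(w[0] * v[1] - w[1] * v[0]) < tol

A = [[3.0, 1.0],
     [0.0, 2.0]]            # diagonalizable, eigenvalues 3 and 2

v1 = [1.0, 0.0]             # eigenvector for eigenvalue 3
v2 = [1.0, -1.0]            # eigenvector for eigenvalue 2

print(is_eigenvector(A, v1))    # True
print(is_eigenvector(A, v2))    # True
# The sum v1 + v2 mixes the two eigenspaces and is no longer an eigenvector:
print(is_eigenvector(A, [v1[0] + v2[0], v1[1] + v2[1]]))    # False
```

This is exactly what Gram–Schmidt does after the first vector: it replaces each eigenvector with a linear combination of eigenvectors for different eigenvalues, destroying the eigenvector property.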