
I would like to do an eigenvalue decomposition of the positive-definite matrix $A^{\top} A$. Eigenvalue decomposition algorithms typically return eigenvectors that are orthogonal to each other, even when the eigenspaces have dimension larger than 1.

However, I would like the columns of the resulting eigenvector matrix $U$ (with $A^{\top}A = U \Sigma U^{\top}$) to be orthonormal under a different inner product, not the regular dot product.

Is there an eigendecomposition algorithm that does this? Or should I just run the eigenvalue decomposition as usual and then orthonormalize each eigenspace separately using Gram-Schmidt? Is there a more natural way to do it?

(Note that I assume eigenvectors from different eigenspaces will be orthogonal under the new inner product, since the inner product is $\langle x,y\rangle = x^{\top} B^{\top}By$, and $Bu_i$ is an eigenvector of another matrix for every column $u_i$ of $U$, with the corresponding eigenvalue from $\Sigma$.)
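For what it's worth, the Gram-Schmidt-per-eigenspace route can be done in one shot per eigenspace with a QR factorization: if $BV = QR$ for an eigenspace basis $V$, then the columns of $VR^{-1}$ are orthonormal in the $B$-inner product (since $B V R^{-1} = Q$), and they are still eigenvectors because they are combinations of vectors within one eigenspace. A minimal NumPy sketch, assuming $B$ is invertible and eigenvalues closer than `tol` are treated as one eigenspace (the helper name `b_orthonormal_eigvecs` is mine):

```python
import numpy as np

def b_orthonormal_eigvecs(M, B, tol=1e-8):
    """Eigendecomposition of symmetric M whose eigenvectors are, within each
    eigenspace, orthonormal under the inner product <x, y> = x^T B^T B y."""
    w, V = np.linalg.eigh(M)  # standard eigendecomposition, ascending eigenvalues
    i = 0
    while i < len(w):
        # group eigenvalues that agree up to tol: one (numerical) eigenspace
        j = i + 1
        while j < len(w) and abs(w[j] - w[i]) < tol:
            j += 1
        block = V[:, i:j]
        # B @ block = Q R  =>  columns of block @ R^{-1} satisfy
        # (B X)^T (B X) = Q^T Q = I, i.e. B-orthonormal
        Q, R = np.linalg.qr(B @ block)
        V[:, i:j] = np.linalg.solve(R.T, block.T).T  # block @ inv(R)
        i = j
    return w, V
```

Across distinct eigenspaces this does nothing; it relies on the assumption above that those are already $B$-orthogonal.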

  • Look up "generalized singular value decomposition". – 2012-08-15
  • I just looked it up on Wikipedia: en.wikipedia.org/wiki/Generalized_singular_value_decomposition. I am not sure how it is related? Thanks. – 2012-08-15
  • Hmm, the discussion there is somewhat spotty. See [this](http://benisrael.net/VAN-LOAN-GENERALIZED-SVD.pdf) instead. – 2012-08-15
  • For normal matrices, eigenvectors from different eigenspaces are already orthogonal to each other (in the regular inner product); how can those eigenvectors also be orthogonal under the new inner product? – 2012-08-16

0 Answers