Since a covariance matrix is symmetric and positive semi-definite, if I calculate its eigenvectors, what are the properties of the space constructed from those eigenvectors (the ones corresponding to eigenvalues that are not close to zero)? Is it orthogonal, or is there anything else special about it? Suppose this eigenvector matrix is called U; then what are the properties of U*transpose(U)?
For a covariance matrix, what are the properties associated with the eigenvector space of this matrix?
2
linear-algebra
statistics
-
0 One can always set things up such that the matrix of eigenvectors of a symmetric positive semidefinite matrix is an orthogonal matrix, zero eigenvalues or not. – 2011-04-16
-
0 How does one set things up that way? And what would U*transpose(U) be? – 2011-04-16
-
0 Most eigenroutines would generate an orthogonal matrix of eigenvectors. *Mathematica* and MATLAB (due to how the LAPACK routines are set up) do. Remember that multiplying an orthogonal matrix by its transpose gives the identity matrix. – 2011-04-16
-
0 @J.M.: Do you know what the eigenvectors of a symmetric positive semidefinite matrix mean if the matrix is not necessarily a covariance matrix? – 2011-04-17
-
0 @Mitch: I tend to think of those things geometrically, much like Qiaochu's answer [here](http://math.stackexchange.com/questions/9758/9763#9763)... or did you have something else in mind? – 2011-04-17
2 Answers
2
A real symmetric matrix has orthogonal eigenvectors, irrespective of whether it is positive definite or has zero eigenvalues. Hence, if we normalize the eigenvectors so that the columns of $U$ are orthonormal, $U U^T = I$.
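As a quick numerical check (a sketch using NumPy; the covariance matrix here is just illustrative, and `numpy.linalg.eigh` is the routine for symmetric matrices, which returns orthonormal eigenvectors):

    import numpy as np

    # Covariance matrix of some random data: symmetric positive semidefinite.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    C = np.cov(X, rowvar=False)

    # eigh is intended for symmetric matrices and returns orthonormal eigenvectors
    # (even when eigenvalues repeat or are zero).
    eigenvalues, U = np.linalg.eigh(C)

    print(np.allclose(U @ U.T, np.eye(3)))  # True: U * transpose(U) = I
    print(np.allclose(U.T @ U, np.eye(3)))  # True: transpose(U) * U = I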
-
1 This assumes, of course, that orthogonal bases have been selected for any degenerate eigenspaces. For instance, $(1,0)$ and $(\sqrt{1/2}, \sqrt{1/2})$ form a normalized eigenbasis for the $2 \times 2$ identity matrix, but $U U^T \ne I$. – 2011-04-17
-
0 @whuber: You're right. – 2011-04-17
1
The eigenvectors correspond to the principal components, and the eigenvalues give the variance explained by each principal component.
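As an illustration (a minimal NumPy sketch; the data and variable names are made up for the example), the eigendecomposition of the sample covariance matrix gives the principal directions, and the variance of the data projected onto each direction equals the corresponding eigenvalue:

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0],
                                              [1.0, 0.5]])  # correlated 2-D data
    Xc = X - X.mean(axis=0)                                  # center the data

    C = np.cov(Xc, rowvar=False)                             # sample covariance matrix
    eigenvalues, eigenvectors = np.linalg.eigh(C)

    # Sort so the direction with the largest explained variance comes first.
    order = np.argsort(eigenvalues)[::-1]
    components = eigenvectors[:, order]          # principal components (columns)
    explained_variance = eigenvalues[order]      # variance explained by each component

    # Projecting onto the components gives the principal component scores;
    # their sample variances match the eigenvalues.
    scores = Xc @ components
    print(explained_variance)
    print(np.allclose(scores.var(axis=0, ddof=1), explained_variance))  # True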