I have a very simple question. Are the eigenvectors of any matrix always orthogonal? I am trying to understand principal components, and it is crucial for me to see the basis of eigenvectors.
orthogonal eigenvectors
-
No. Take any non-orthogonal basis $(v_1,\dots,v_n)$ and define a linear map $A$ on this basis by sending each $v_i$ to $iv_i$. The eigenspaces are the $n$ lines generated by the $v_i$, and these are by construction not orthogonal. – 2012-05-08
-
Eigenvectors corresponding to different eigenvalues will be orthogonal if the matrix is symmetric. This is part of the real spectral theorem. – 2012-05-08
-
The case @Dylan is describing will apply to your study of principal components, since the underlying matrices are symmetric... – 2012-05-08
3 Answers
Fix two linearly independent vectors $u$ and $v$ in $\mathbb{R}^2$, and define $Tu=u$ and $Tv=2v$. Then extend $T$ linearly to a map from $\mathbb{R}^2$ to itself. The eigenvectors of $T$ are $u$ and $v$ (or any multiples of them). Of course, $u$ need not be perpendicular to $v$.
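To make the counterexample concrete, here is a minimal NumPy sketch (the specific matrix is my own illustrative choice, not taken from the answer above): its eigenvectors are along $(1,0)$ and $(1,1)$, which are clearly not orthogonal.

```python
import numpy as np

# A non-symmetric matrix: it maps (1, 0) -> (1, 0) and (1, 1) -> 2*(1, 1),
# so its eigenvectors are (1, 0) and (1, 1), which are not orthogonal.
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print("eigenvalues:", eigenvalues)            # [1. 2.]
print("eigenvectors (columns):\n", eigenvectors)

# The dot product of the two eigenvectors is nonzero -> not orthogonal.
v1, v2 = eigenvectors[:, 0], eigenvectors[:, 1]
print("dot product:", v1 @ v2)
```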
-
Thank you very much for your answers. Could anyone state whether they are orthogonal in the PCA case? – 2012-05-08
-
For PCA, things can always be set up such that the eigenvectors are orthogonal. On the other hand, I would recommend looking at PCA as a singular value decomposition instead of as an eigendecomposition. It's been discussed here on math.SE a number of times; search around. – 2012-05-08
-
What do you mean by "setting up"? Is there some common technique to achieve a singular matrix? – 2012-05-09
-
My understanding based on https://en.wikipedia.org/wiki/Principal_component_analysis is that a singular value decomposition (SVD) of $X$ results in three matrices, $X = U\Sigma V^T$, where the columns of $V$ are the eigenvectors of $X^T X$ and are what PCA uses. This matrix is always orthogonal; a quick numerical check is sketched below, after the comments. – 2013-06-14
-
Beautiful answer. – 2013-11-13
-
What is PCA? Hmm... So that's a counterexample. – 2016-06-06
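As a quick numerical check of the SVD route suggested in the comments above, here is a short sketch (NumPy, with a random centered matrix standing in for real data; the names and dimensions are just illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))     # 100 samples, 3 features
X = X - X.mean(axis=0)                # center the columns, as PCA assumes

# SVD: X = U @ diag(s) @ Vt.  The columns of V (rows of Vt) are the
# principal directions, i.e. eigenvectors of X.T @ X.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
V = Vt.T

# V is orthogonal: V.T @ V is the identity.
print(np.allclose(V.T @ V, np.eye(3)))            # True

# The same directions arise as eigenvectors of the symmetric matrix X.T @ X,
# with eigenvalues equal to the squared singular values.
eigvals, eigvecs = np.linalg.eigh(X.T @ X)        # eigh: ascending order
print(np.allclose(eigvals[::-1], s**2))           # True
```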
In general, the eigenvectors of a matrix are NOT always orthogonal. But for a special type of matrix, a symmetric matrix, the eigenvalues are always real and the eigenvectors can always be chosen to be orthogonal (eigenvectors belonging to distinct eigenvalues are automatically orthogonal).
For any matrix M with n rows and m columns, multiplying M by its transpose, either MM' or M'M, results in a symmetric matrix, so for this symmetric matrix the eigenvectors are always orthogonal.
In the application of PCA, a dataset of n samples with m features is usually represented by an n×m matrix D. Assuming each feature has been mean-centered, the variances and covariances among those m features are captured (up to a constant factor) by the m×m matrix D'D, which is symmetric: the diagonal entries correspond to the variances of the individual features, and the entry in row i, column j corresponds to the covariance between features i and j. PCA is applied to this symmetric matrix, so the eigenvectors are guaranteed to be orthogonal.
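Here is a small sketch of that setup (NumPy; the data is random and only meant to illustrate the shapes, the orthogonality of the eigenvectors, and the fact that the projected components come out uncorrelated):

```python
import numpy as np

rng = np.random.default_rng(1)
D = rng.standard_normal((50, 4))          # n = 50 samples, m = 4 features
D = D - D.mean(axis=0)                    # mean-center each feature

C = D.T @ D / (D.shape[0] - 1)            # m x m covariance matrix, symmetric

# eigh is intended for symmetric matrices: real eigenvalues,
# orthonormal eigenvectors (returned as columns).
eigvals, W = np.linalg.eigh(C)
print(np.allclose(W.T @ W, np.eye(4)))    # True: eigenvectors are orthonormal

# Projecting the data onto these orthogonal directions gives principal
# component scores whose covariance matrix is diagonal (uncorrelated).
scores = D @ W
print(np.allclose(np.cov(scores, rowvar=False), np.diag(eigvals), atol=1e-10))
```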
In the context of PCA: it is usually applied to a positive semi-definite (PSD) matrix, such as a cross-product matrix $X'X$, or a covariance or correlation matrix.
In this PSD case, all eigenvalues satisfy $\lambda_i \ge 0$, and if $\lambda_i \ne \lambda_j$, then the corresponding eigenvectors are orthogonal. If $\lambda_i = \lambda_j$, then any two orthogonal vectors in the shared eigenspace serve as eigenvectors for it.
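To illustrate the repeated-eigenvalue case, here is a small sketch (NumPy; the diagonal matrix is just a convenient PSD example with a repeated eigenvalue):

```python
import numpy as np

# PSD matrix with a repeated eigenvalue: 2 appears twice, so its
# eigenspace is a whole plane, not a single line.
A = np.diag([2.0, 2.0, 1.0])

# Within that plane, ANY pair of orthogonal vectors works as eigenvectors:
u = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
w = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
print(np.allclose(A @ u, 2 * u), np.allclose(A @ w, 2 * w), np.isclose(u @ w, 0))

# eigh simply picks one such orthonormal basis for the eigenspace.
eigvals, eigvecs = np.linalg.eigh(A)
print(eigvals)                                        # [1. 2. 2.]
print(np.allclose(eigvecs.T @ eigvecs, np.eye(3)))    # True
```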