11

The wiki article on eigenvectors offers the following geometrical interpretation:

Each application of the matrix to an arbitrary vector yields a result which will have rotated towards the eigenvector with the largest eigenvalue.
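That statement is essentially the power iteration: repeatedly applying the matrix and renormalizing converges, for a generic starting vector, to the eigenvector of largest-magnitude eigenvalue. Here is a minimal Python/NumPy sketch (the matrix $A$ is an arbitrary example of mine, not from the article):

```python
import numpy as np

# Arbitrary example matrix (symmetric, so its eigenvectors are real).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

eigvals, eigvecs = np.linalg.eigh(A)
dominant = eigvecs[:, np.argmax(np.abs(eigvals))]   # eigenvector of largest |eigenvalue|

v = np.array([1.0, 0.0])        # arbitrary starting vector
for _ in range(50):
    v = A @ v                   # one application of the matrix
    v /= np.linalg.norm(v)      # renormalize so the vector doesn't blow up

print(abs(v @ dominant))        # ~1.0: v has rotated onto the dominant eigenvector
```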

Qn 1: Is there any other geometrical interpretation, particularly in the context of a covariance matrix?

The wiki also discusses the difference between left and right eigenvectors.

Qn 2: Do the above geometrical interpretations hold irrespective of whether they are left or right eigenvectors?

  • 0
@Kaestur Hakarl - that link is broken :( 2012-07-03

4 Answers

12

Here is a partial answer for the case where $M$ is a real symmetric matrix. By the real spectral theorem, such an $M$ has real eigenvectors with real eigenvalues, so there is a chance for a genuine geometric interpretation which stays in $\mathbb{R}^n$.

$M$ acts on the unit sphere in $\mathbb{R}^n$ in the following way: it sends the unit sphere $\{v : v^T v = 1\}$ to its image $\{Mv : v^T v = 1\}$, which (for invertible $M$) is the set $\{w : w^T (M M^T)^{-1} w = 1\}$. This image is not generally a sphere, but it is an ellipsoid. The axes of this ellipsoid lie along the eigenvectors of $M$, and the length of each semi-axis is the absolute value of the corresponding eigenvalue (equivalently, the squared semi-axis lengths are the eigenvalues of $M^T M$).
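Here is a quick numerical check of this picture (a Python/NumPy sketch; the symmetric matrix $M$ below is an arbitrary example):

```python
import numpy as np

M = np.array([[2.0, 1.0],       # arbitrary real symmetric example
              [1.0, 3.0]])
eigvals, eigvecs = np.linalg.eigh(M)

# Push a dense sampling of the unit circle through M.
theta = np.linspace(0.0, 2.0 * np.pi, 100000)
circle = np.vstack([np.cos(theta), np.sin(theta)])   # points with v^T v = 1
image = M @ circle

radii = np.linalg.norm(image, axis=0)
print(radii.max(), abs(eigvals).max())   # longest semi-axis  = largest |eigenvalue|
print(radii.min(), abs(eigvals).min())   # shortest semi-axis = smallest |eigenvalue|

# The longest axis of the ellipse points along the corresponding eigenvector.
longest = image[:, np.argmax(radii)] / radii.max()
print(abs(longest @ eigvecs[:, np.argmax(abs(eigvals))]))   # ~1.0
```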

  • 0
Fair enough, but I am OK with your answer. 2010-07-29
6

If you are interested in covariance matrices, then the eigenvectors of the covariance matrix tell you how to change variables to make your random variables uncorrelated.

Specifically, let $M$ be the covariance matrix of the random variables $X_1, \ldots, X_n$. For simplicity, let's assume that all of these random variables are zero-mean. Let's also define $X$ to be the random vector whose $i$-th component is the random variable $X_i$. Let the eigenvectors of $M$ be $v_1, v_2, \ldots, v_n$. We can take these to be orthonormal, since $M$ is symmetric. Consider the random variables $Y_i = v_i^T X$, obtained by taking the dot product of $v_i$ and $X$.

Then the random variables $Y_1, ..., Y_n$ are uncorrelated!

Indeed:

$E[Y_i Y_j] = E[v_i^T X X^T v_j] = v_i^T M v_j = \lambda_j v_i^T v_j = 0$ for $i \neq j$, using $M = E[X X^T]$ (valid since the $X_i$ are zero-mean) and the orthonormality of the $v_i$.
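Here is a numerical sanity check of this computation (a Python/NumPy sketch; the mixing matrix and sample count are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build zero-mean, correlated random variables X_1, X_2, X_3:
# rows of X are the variables, columns are samples.
A = rng.normal(size=(3, 3))                 # arbitrary mixing matrix
X = A @ rng.normal(size=(3, 200000))

M = np.cov(X)                               # (sample) covariance matrix
eigvals, V = np.linalg.eigh(M)              # columns of V are orthonormal v_i

Y = V.T @ X                                 # Y_i = v_i . X
print(np.round(np.cov(Y), 3))               # ~diagonal: the Y_i are uncorrelated
```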

  • 1
that's what I was looking for. good show! 2010-07-29
6

Instead of giving an answer, let me point you to the chapter on eigenvalues in Cleve Moler's book "Numerical Computing with MATLAB". It contains a nice geometric demonstration in MATLAB of how the eigenvalues/eigenvectors (as well as the singular values/vectors) of a $2 \times 2$ matrix govern how a circle is transformed into an ellipse by the linear transformation the matrix represents.
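For readers without MATLAB, here is a rough Python/NumPy analogue of that circle-to-ellipse demonstration (my own sketch, not Moler's code). For a non-symmetric matrix, it is the singular values/vectors that describe the resulting ellipse:

```python
import numpy as np

A = np.array([[1.0, 2.0],       # arbitrary non-symmetric 2x2 example
              [0.0, 1.0]])
U, s, Vt = np.linalg.svd(A)

theta = np.linspace(0.0, 2.0 * np.pi, 100000)
circle = np.vstack([np.cos(theta), np.sin(theta)])
ellipse = A @ circle            # image of the unit circle

radii = np.linalg.norm(ellipse, axis=0)
print(radii.max(), s[0])        # longest semi-axis  = largest singular value
print(radii.min(), s[1])        # shortest semi-axis = smallest singular value
# The ellipse's axes lie along the columns of U (the left singular vectors).
```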

  • 0
Lo and behold, [the screenshot](http://i.imgur.com/dwZgd.png). 2012-08-01
2

Of course! Consider a coordinate transformation consisting of rotation and/or scaling (but not translation):

$v = Au$

where $v$ and $u$ are vectors and $A$ is a transformation matrix. Then the eigenvectors of $A$, if they have real components, are the axes which are left unrotated (scaled only) by the transformation (see Wikipedia).
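Here is a minimal numerical check of that claim (a Python/NumPy sketch with an arbitrary example matrix of my own):

```python
import numpy as np

A = np.array([[3.0, 1.0],       # arbitrary example with real eigenvalues
              [0.0, 2.0]])
eigvals, eigvecs = np.linalg.eig(A)

for lam, u in zip(eigvals, eigvecs.T):
    # A u is parallel to u: the axis is scaled by lambda, not rotated.
    print(np.allclose(A @ u, lam * u))      # True, True
```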

A covariance matrix is a symmetric, positive semidefinite matrix, so it has orthonormal eigenvectors, and these form a set of axes; I am fairly sure the eigenvectors give a new basis of linear combinations of the input variables in which the basis variables are uncorrelated, but I can't remember how to show this.

For example, if $w_1 = \begin{pmatrix} x \\ y \end{pmatrix}$ is a pair of independent unit-variance zero-mean Gaussian random variables, consider $w_2 = \begin{pmatrix} u \\ v \end{pmatrix} = \begin{pmatrix} 1 & 1 \\ 2 & 1 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} x+y \\ 2x+y \end{pmatrix}$, so that $w_1 = \begin{pmatrix} -1 & 1 \\ 2 & -1 \end{pmatrix}\begin{pmatrix} u \\ v \end{pmatrix} = \begin{pmatrix} v-u \\ 2u-v \end{pmatrix}$. Then $\operatorname{cov}(w_2) = \begin{pmatrix} 2 & 3 \\ 3 & 5 \end{pmatrix}$. This has eigenvectors which have $\sqrt{5}$ in them, hmmmm...
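Here is a quick numerical check of this example, including the decorrelation claim above (a Python/NumPy sketch; the sample size is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)

w1 = rng.normal(size=(2, 500000))            # independent unit-variance, zero-mean x, y
A = np.array([[1.0, 1.0],
              [2.0, 1.0]])
w2 = A @ w1                                  # (u, v) = (x + y, 2x + y)

print(np.round(np.cov(w2), 2))               # ~[[2, 3], [3, 5]], as claimed

C = np.array([[2.0, 3.0], [3.0, 5.0]])
eigvals, V = np.linalg.eigh(C)
print(eigvals)                               # (7 -+ 3*sqrt(5))/2 -- there's the sqrt(5)

# Projecting onto the eigenvectors gives uncorrelated variables.
z = V.T @ w2
print(np.round(np.cov(z), 2))                # ~diagonal
```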

As for question 2, I'm not sure.

  • 0
I added the section on covariance matrices. Forgot how to show what I wanted to show, though. 2010-07-29