
I know that the number of eigenvalues of a matrix is at most its dimension, but the intuition for an eigenvector is that it keeps its original direction under a linear transformation. I would therefore think that in $\mathbb{R}^n$ there are $n$ vectors which can do this. Why is this not true? Could anyone please give an intuitive explanation of why the number of independent eigenvectors can be less than the dimension, or why the geometric multiplicity can be less than the algebraic multiplicity?

  • Well, you certainly couldn't have *more* eigenvalues than independent directions! (2012-12-03)

1 Answer


Less than or equal. And it is not exactly the 'number' of eigenvalues, but the sum of the dimensions of the eigenspaces.

The best way to understand it is through simple examples.

  1. The identity $\Bbb R^n\to\Bbb R^n$ fixes every vector, so every vector (except $0$) is an eigenvector with eigenvalue $1$ (there are infinitely many of them), and together they span the whole space, that is, dimension $n$.
  2. Similarly for the reflection through the origin, $x\mapsto -x$ in $\Bbb R^n$: every nonzero vector is an eigenvector with eigenvalue $-1$.
  3. As James S. Cook commented, a rotation in $\Bbb R^2$ (by an angle that is not a multiple of $\pi$) doesn't have any real eigenvalue. So in this case, the sum is indeed less than $n$.
  4. Take the 'toppling' (shear) function in the plane: $(x,y)\mapsto (x+y,y)$. Then you can calculate, as in the worked example after this list, that it has only $1$ as an eigenvalue, with a $1$-dimensional eigenspace: the $x$-axis.
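To make example 4 concrete, here is the calculation (writing the shear as a matrix and computing the characteristic polynomial):

$$A=\begin{pmatrix}1&1\\0&1\end{pmatrix},\qquad \det(A-\lambda I)=(1-\lambda)^2,$$

so $\lambda=1$ is the only eigenvalue, with algebraic multiplicity $2$. But $(A-I)\begin{pmatrix}x\\y\end{pmatrix}=\begin{pmatrix}y\\0\end{pmatrix}$ vanishes only when $y=0$, so $E_1$ is the $x$-axis: the geometric multiplicity is $1$, strictly less than the algebraic multiplicity $2$.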

And why is the sum of the dimensions of the eigenspaces of a transformation $A$ at most the dimension of the space? It is basically because if $\lambda\ne\mu$, then the eigenspaces $E_\lambda:=\{x\mid Ax=\lambda x\}$ and $E_\mu$ are disjoint: $E_\lambda\cap E_\mu=\{0\}$.
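The verification of this last claim is one line: if $x\in E_\lambda\cap E_\mu$, then

$$\lambda x=Ax=\mu x\quad\Longrightarrow\quad(\lambda-\mu)x=0,$$

and since $\lambda\ne\mu$, this forces $x=0$.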

  • Disjointness (after removing the null vector) is too weak. In $\Bbb R^2$ one can easily find $3$ subspaces of dimension $1$ (or indeed infinitely many) that are disjoint in this sense. The proper term is "direct sum". And yes, eigenspaces for a collection of distinct eigenvalues always form a direct sum. (2018-04-18)