3

Anyone know of a linear transformation for which there does not exist a basis of eigenvectors?

What would indicate to me whether a particular linear transformation does or doesn't have a basis of eigenvectors? It seems that if a linear transformation isn't invertible then it won't have a basis of eigenvectors, but I can't think of a linear transformation that isn't invertible.
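
As a quick sanity check of that last guess (a sketch assuming NumPy; the projection used here is just one illustrative non-invertible map, not from the original post): projection onto the $x$-axis is not invertible, yet the two standard basis vectors are eigenvectors of it (with eigenvalues 1 and 0), so $\mathbb{R}^2$ still has a basis of eigenvectors for this map.

```python
# Sketch assuming NumPy: a non-invertible map that nevertheless has a basis of
# eigenvectors (projection onto the x-axis).
import numpy as np

P = np.array([[1.0, 0.0],
              [0.0, 0.0]])       # projection onto the x-axis

print(np.linalg.det(P))          # 0.0 -- P is not invertible
vals, vecs = np.linalg.eig(P)
print(vals)                      # [1. 0.]
print(vecs)                      # columns are e1 and e2: two independent eigenvectors
```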

  • 6
    Rotation by $90^\circ$ in $\mathbb{R}^2$ has no eigenvectors.2012-04-28
  • 1
    Here's a fundamental example of a non-invertible linear transformation you should keep in mind: for any vector space $V$ of positive dimension, the map $Z:V\to V$ with $Z(v)=0$ for all $v\in V$ is linear and is not invertible.2012-04-28
  • 1
    But on the other hand, every vector is an eigenvector of that map.2012-04-28
  • 1
    If the matrix is not invertible, it will definitely have at least one eigenvector: any nonzero vector in the kernel is an eigenvector with eigenvalue 0.2012-04-28
  • 0
    @Jim_CS: You may be interested to read about [generalized eigenvectors](http://en.wikipedia.org/wiki/Generalized_eigenvector).2012-04-28
  • 0
    @HenryT.Horton Why only rotation in $\mathbb R^2$? I would have thought that a rotation in $\mathbb R^3$ also changes the direction of every vector, so eigenvectors aren't possible...?2012-04-29
  • 1
    @Jim_CS: I didn't claim my example was the only type of rotation with no eigenvectors... However, a rotation in $\mathbb{R}^3$ is always equivalent to a rotation about some fixed axis, and hence a vector pointing in the direction of that axis is an eigenvector of the rotation (see Euler's rotation theorem). The general idea behind my example is to construct a real linear transformation whose characteristic polynomial has no real roots. (A numerical check of both rotations appears just after these comments.)2012-04-29
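
A numerical sketch of the two rotations discussed in the comments above, assuming NumPy (the matrices are the standard $90^\circ$ rotation of $\mathbb{R}^2$ and a $90^\circ$ rotation of $\mathbb{R}^3$ about the $z$-axis): the planar rotation has only non-real eigenvalues, so no real eigenvectors, while the spatial rotation fixes the axis direction, which is therefore an eigenvector.

```python
# Sketch assuming NumPy: eigenvalues of a planar rotation vs. a spatial rotation.
import numpy as np

# Rotation by 90 degrees in R^2: no real eigenvalues, hence no real eigenvectors.
R2 = np.array([[0.0, -1.0],
               [1.0,  0.0]])
vals2, _ = np.linalg.eig(R2)
print(vals2)                        # approximately [0.+1.j, 0.-1.j]

# Rotation by 90 degrees about the z-axis in R^3: eigenvalue 1, with the axis
# direction as an eigenvector (Euler's rotation theorem).
R3 = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
vals3, vecs3 = np.linalg.eig(R3)
real = np.isclose(vals3.imag, 0.0)
print(vals3[real].real)             # [1.]
print(vecs3[:, real].real.ravel())  # [0. 0. 1.] -- the rotation axis
```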

2 Answers

8

$\begin{bmatrix} 1 & 1\\ 0 & 1\end{bmatrix}$ has only one eigenvalue, 1, with algebraic multiplicity 2 but geometric multiplicity 1. In other words, there is only one eigenvector (up to scalar multiples) for this eigenvalue, so there is no basis of eigenvectors.

To convince yourself that there is no basis of eigenvectors, think about what would happen if there were one: since 1 is the only eigenvalue, every basis vector, and hence every vector, would be mapped to itself, so the matrix in question would have to be the identity matrix, which it is not.
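
A minimal numerical check of this answer, assuming NumPy: the eigenvalue 1 appears with algebraic multiplicity 2, but $\ker(A - I)$ is only one-dimensional, so there is no second independent eigenvector.

```python
# Sketch assuming NumPy: algebraic vs. geometric multiplicity for the matrix above.
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

vals, _ = np.linalg.eig(A)
print(vals)                                   # [1. 1.] -- 1 is a double eigenvalue

# Geometric multiplicity = dim ker(A - I) = 2 - rank(A - I).
geom_mult = 2 - np.linalg.matrix_rank(A - np.eye(2))
print(geom_mult)                              # 1 -- only one independent eigenvector
```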

  • 1
    P.S. What Brett has there is what's commonly called a *Jordan block*.2012-05-02
  • 1
    To add to J.M.'s point, it turns out that this is, up to conjugation and its higher-dimensional analogues, the only thing that can go wrong. You can find out more by looking up Jordan canonical forms.2012-05-02
  • 0
    @BrettFrankel: To be more precise there are two things that can go wrong: (1) the characteristic (or equivalently the minimal) polynomial of $f$ does not split into linear factors (but this cannot happen over an algebraically closed field like $\Bbb C$), or (2) there is some eigenvalue $\lambda$ and a vector $v$ that is _not_ an eigenvector but such that $(\lambda I-f)^2(v)=0$ (as happens here with $\lambda=1$ and $v$ the second standard basis vector). One can show (not so easily) that when (1) is excluded, $f$ has a matrix that has a _list_ of Jordan blocks for each $\lambda$. (A small numerical check of case (2) appears just after these comments.)2012-12-11
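
A small numerical check, assuming NumPy, of case (2) in the comment above, with $A$ the matrix from this answer, $\lambda = 1$, and $v$ the second standard basis vector: $(\lambda I - A)v \neq 0$, so $v$ is not an eigenvector, yet $(\lambda I - A)^2 v = 0$, so $v$ is a generalized eigenvector.

```python
# Sketch assuming NumPy: a generalized eigenvector of the Jordan block above.
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam = 1.0                        # the only eigenvalue
v = np.array([0.0, 1.0])         # second standard basis vector

N = lam * np.eye(2) - A          # lambda*I - A
print(N @ v)                     # [-1.  0.] -- nonzero, so v is not an eigenvector
print(N @ N @ v)                 # [0. 0.]   -- but (lambda*I - A)^2 v = 0
```
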
1

May I suggest $\begin{bmatrix} 0 & 1\\ -1 & 0\end{bmatrix}$ on $\mathbb{R}^2$?!

  • 0
    You may of course suggest it, but it was already mentioned in a comment by Henry T. Horton to the question.2012-12-07
  • 0
    Oh, I'm so sorry. Scuse me.2012-12-07