
So I asked a question recently:

Finding the matrix of this linear transformation

and I'm wondering something else.

$T : V \to V$ is a linear transformation. There is a vector $v \in V$ such that $T^n(v) = 0$. We're also told that the vectors $T^{n-1}(v), T^{n-2}(v), \ldots, T(v), v$ form a basis for $V$.

Let's also add that $T^{n-1}(v) \neq 0$.

What if we're not told what the basis is? How do you find a basis for the transformation, or at least prove that the given set of vectors is a basis?

In general, a set of vectors is a basis for a space if the vectors are linearly independent and they span the space. How can we show that if we're not given what the actual vectors are?

Edit: My question boils down to: How do I show that those vectors form a basis for $V$, if we don't even know what $V$ is?
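
For concreteness, here is a small numerical sketch of the kind of situation I mean (the particular matrix and vector are just my own example, not something given in the problem): $V = \mathbb{R}^3$, $T$ the "shift" map with $T^3 = 0$, and $v$ a vector with $T^2(v) \neq 0$.

    # My own illustrative choice of T and v, not given in the problem:
    import numpy as np

    T = np.array([[0., 1., 0.],
                  [0., 0., 1.],
                  [0., 0., 0.]])    # T^3 = 0
    v = np.array([0., 0., 1.])      # T^2(v) = e1 != 0

    chain = np.column_stack([T @ T @ v, T @ v, v])            # T^2(v), T(v), v
    print(np.linalg.matrix_rank(chain))                       # 3, so the chain is a basis of R^3
    print(np.allclose(np.linalg.matrix_power(T, 3) @ v, 0))   # True: T^3(v) = 0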

  • a) What does "=/=" stand for? b) There is no such thing as a basis for a transformation. A basis is a basis for a vector space. c) Your question is extremely general. In different scenarios, one might use all sorts of methods to find a basis or prove that a given set is a basis. Do you have anything more specific in mind? (2011-12-14)
  • The fact that $T^{n-1}(v)\neq 0$ follows from "$T^{n-1}(v)$, $T^{n-2}(v),\ldots,T(v),v$ form a basis for $V$". No basis can include the zero basis. (2011-12-14)
  • Above, for "zero basis", read "zero vector". (2011-12-14)
  • I follow that. I guess my question can be revised to "How do I show that those vectors are indeed a basis for $V$?" (2011-12-14)
  • @JohnDoe: In this generality, **you cannot**. There is not enough information to guarantee it. (2011-12-14)
  • @joriki: Oops; quite so. Thank you! (2011-12-14)

1 Answer


There is no such thing as a "basis for the transformation". Vector spaces have bases; linear transformations do not.

What you are looking at is the beginning of the systematic study that leads to the Rational Canonical Form of a transformation.

Though it takes quite a bit of doing, one can prove that for any finite-dimensional vector space $V$ and any linear transformation $T : V \to V$, there exist vectors $v_1,\ldots,v_k$ and positive integers $n_1,\ldots,n_k$ such that:

  1. $v_i$, $T(v_i)$, $T^2(v_i),\ldots,T^{n_i-1}(v_i)$ are linearly independent for each $i$.
  2. $T^{n_i}(v_i)$ is a linear combination of $v_i,T(v_i),\ldots,T^{n_i-1}(v_i)$ for each $i$.
  3. $v_1,\ldots,T^{n_1-1}(v_1),\;v_2,\ldots,T^{n_2-1}(v_2),\;\ldots,\;v_k,\ldots,T^{n_k-1}(v_k)$ is a basis for $V$.

These give "nice" bases even when the matrix is not diagonalizable and no Jordan canonical form is available. (There are a few other properties that go into the rational canonical form besides the three above.)
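
Here is a quick numerical sketch of properties 1–3 in the simplest case $k = 1$ (a single cyclic vector). The particular matrix, the companion matrix of $x^3 - 2x^2 + x - 5$, and the choice $v = e_1$ are only an illustration, not part of the general theorem:

    import numpy as np

    # Companion matrix of x^3 - 2x^2 + x - 5 (an arbitrary illustrative choice)
    T = np.array([[0., 0.,  5.],
                  [1., 0., -1.],
                  [0., 1.,  2.]])
    v = np.array([1., 0., 0.])                   # cyclic vector: k = 1, n_1 = 3

    B = np.column_stack([v, T @ v, T @ T @ v])   # v, T(v), T^2(v)
    print(np.linalg.matrix_rank(B))              # 3: properties 1 and 3

    # Property 2: T^3(v) is a combination of v, T(v), T^2(v);
    # solving B c = T^3(v) recovers the coefficients.
    c = np.linalg.solve(B, np.linalg.matrix_power(T, 3) @ v)
    print(c)                                     # [ 5. -1.  2.]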

  • I asked my teacher today, and she showed an inductive argument that proves that the last two vectors $T^{n-1}(v), T^{n-2}(v)$ are linearly independent, and you can generalize it to show that all the vectors are linearly independent. Since we have $n$ linearly independent vectors in an $n$-dimensional space, they must form a basis, and the argument is done. (A sketch of this argument is written out after these comments.) Is that what you were referring to w.r.t. the work to prove 1, 2, 3? (2011-12-14)
  • @John: I don't know what your teacher showed you, but there were more assumptions than simply the fact that you had a linear transformation; it is *not* the case that for every linear transformation you can find a $v$ for which what you claim holds, so there were extra assumptions at play. (2011-12-15)
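
For reference, here is a sketch of the inductive argument mentioned in the comment above, carried out symbolically for the concrete case $n = 3$ (the shift matrix and the vector $v$ are my own choices): applying $T^{n-1}$ to a supposed dependence relation kills every term except the one involving $v$, forcing its coefficient to vanish, and the remaining coefficients are peeled off the same way.

    import sympy as sp

    a0, a1, a2 = sp.symbols('a0 a1 a2')
    T = sp.Matrix([[0, 1, 0],
                   [0, 0, 1],
                   [0, 0, 0]])          # nilpotent: T**3 == 0
    v = sp.Matrix([0, 0, 1])            # T**2 * v = e1 != 0

    # Suppose a0*v + a1*T(v) + a2*T^2(v) = 0.
    w = a0*v + a1*(T*v) + a2*(T**2*v)

    # Applying T^2 kills every term except a0*T^2(v), so a0 must be 0:
    print((T**2 * w).T)                 # Matrix([[a0, 0, 0]])
    # With a0 = 0, applying T leaves only a1*T^2(v), so a1 must be 0:
    print((T * w.subs(a0, 0)).T)        # Matrix([[a1, 0, 0]])
    # Finally the relation itself reduces to a2*T^2(v) = 0, so a2 must be 0:
    print(w.subs({a0: 0, a1: 0}).T)     # Matrix([[a2, 0, 0]])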