
I came across the following theorem:

Let $T$ be a linear operator on a vector space $V$, and let $\lambda$ be an eigenvalue of $T$. Suppose that $\gamma_{1}, \ldots, \gamma_{q}$ are cycles of generalized eigenvectors of $T$ corresponding to $\lambda$ such that the initial vectors of the $\gamma_{i}$'s are distinct and form a linearly independent set. Then the $\gamma_{i}$'s are disjoint, and their union is linearly independent.

In the proof, the following is mentioned.

Let $W = \operatorname{span}(\gamma) = \operatorname{span}\bigl(\bigcup_{i} \gamma_{i}\bigr)$. Then $W$ is $(T - \lambda I)$-invariant.

I'm not sure why this is correct. Let $x \in W$. Suppose $x \in \gamma_j$ and $x$ is the initial vector of that cycle; that is, if the cycle is generated by the vector $a$ and $(T - \lambda I)^p(a) = 0$, then $x = (T - \lambda I)^{p-1}(a)$.

But then $(T - \lambda I)(x) = (T - \lambda I)^{p}(a) = 0$, and $0 \notin \gamma_j$, since $(T - \lambda I)^{p}(a)$ is not a member of the cycle.

What am I missing out on?

1 Answer


$0$ may not be in the set of generalized eigenvectors, but it is in their span. The span is the vector subspace generated by those vectors, and vector spaces, subspaces, and spans always contain $0$. You may think of $0$ as the linear combination of none of the generating vectors, if that kind of thinking is to your liking. Otherwise, just include it by fiat.
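For completeness, here is one way the invariance check can be written out, using the cycle notation from the question (a sketch; $p$ denotes the length of the cycle $\gamma_j$ generated by $a$):

```latex
Every vector of $\gamma_j$ has the form $(T - \lambda I)^{k}(a)$ with
$0 \le k \le p-1$, and
\[
(T - \lambda I)\bigl((T - \lambda I)^{k}(a)\bigr)
  = (T - \lambda I)^{k+1}(a)
  = \begin{cases}
      \text{the next vector of } \gamma_j \in W, & k + 1 < p,\\[2pt]
      0 \in W, & k + 1 = p.
    \end{cases}
\]
```

So $T - \lambda I$ sends every spanning vector of $W$ back into $W$, and by linearity it sends all of $W$ into $W$; the initial vector is simply the case $k + 1 = p$, where the image is $0$, which lies in $W$ even though it lies in no $\gamma_j$.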