Let $V$ be a vector space and $T : V\to V$ a linear transformation with the property that $T(W)\subset W$ for every subspace $W$ of $V$. How can we prove that there is an element $\lambda$ in the field of scalars such that $T(v) = \lambda v$ for all $v \in V$?
And [another](http://math.stackexchange.com/q/116223/742) strange bounty offer. What is "non-credible" about the answers given? That they didn't spoon-feed you the full answer in complete detail? If you want more details, ask for details. Saying the answers you received are "not credible" is rather insulting. As for "official sources", sorry, I won't fax you a copy of my degree, but it's pretty official. It says I have "all the rights and privileges thereto pertaining", so I'd say I'm an official source. – 2012-03-23
2 Answers
Here are two hints.
Given $0\neq v\in V$, let $W = \text{span}\{v\}$. The hypothesis on $T$ implies what about $T(v)$?
Assume for a contradiction that $v$ and $w$ are nonzero vectors with $Tv = \lambda v$ and $Tw = \mu w$, where $\lambda \neq \mu$. If $W = \text{span}\{v+w\}$, what happens?
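To see the second hint in action numerically, here is a minimal sketch (the matrix and vectors are my own illustrative choices, not from the question): a map with two distinct eigenvalues $2$ and $3$ fails to preserve the subspace spanned by the sum of the corresponding eigenvectors.

```python
import numpy as np

# A linear map with two distinct eigenvalues (2 and 3 on the standard basis):
T = np.diag([2.0, 3.0])

v = np.array([1.0, 0.0])   # eigenvector with eigenvalue 2
w = np.array([0.0, 1.0])   # eigenvector with eigenvalue 3

u = v + w                  # spans the subspace W = span{v + w}
Tu = T @ u                 # equals 2v + 3w

# Tu is a scalar multiple of u iff the 2x2 determinant |u  Tu| vanishes.
det = u[0] * Tu[1] - u[1] * Tu[0]
print(det)  # nonzero, so T(W) is not contained in W
```

The nonzero determinant is exactly the contradiction the hint points at: if $\lambda \neq \mu$, then $T(v+w) = \lambda v + \mu w$ cannot be a scalar multiple of $v+w$.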
If $\mathbf{v}\neq\mathbf{0}$, let $\mathbf{W}=\mathrm{span}(\mathbf{v})$. Then $T(\mathbf{v})\in \mathbf{W}$, so $T(\mathbf{v}) = \mu_{\mathbf{v}}\mathbf{v}$ for some scalar $\mu_{\mathbf{v}}$ which, for all we know right now, may depend on $\mathbf{v}$.
If $\dim(\mathbf{V})\leq 1$, this suffices. (Why?)
Now, assume $\mathbf{v}$ and $\mathbf{w}$ are two nonzero vectors that are linearly independent. Then: $\mu_{\mathbf{v}+\mathbf{w}}(\mathbf{v}+\mathbf{w}) = T(\mathbf{v}+\mathbf{w}) = T(\mathbf{v}) + T(\mathbf{w}) = \mu_{\mathbf{v}}\mathbf{v} + \mu_{\mathbf{w}}\mathbf{w}.$ That means $(\mu_{\mathbf{v}+\mathbf{w}} - \mu_{\mathbf{v}})\mathbf{v} = (\mu_{\mathbf{w}}-\mu_{\mathbf{v}+\mathbf{w}})\mathbf{w}.$ Given that we are assuming $\mathbf{v}$ and $\mathbf{w}$ are linearly independent, what can we conclude from this?
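For readers who want the final step of the hint spelled out, linear independence forces both coefficients in the last display to vanish:

```latex
% Since v and w are linearly independent, a relation
%   (mu_{v+w} - mu_v) v = (mu_w - mu_{v+w}) w
% can only hold with both coefficients equal to zero:
\mu_{\mathbf{v}+\mathbf{w}} - \mu_{\mathbf{v}} = 0
\quad\text{and}\quad
\mu_{\mathbf{w}} - \mu_{\mathbf{v}+\mathbf{w}} = 0,
\qquad\text{hence}\qquad
\mu_{\mathbf{v}} = \mu_{\mathbf{v}+\mathbf{w}} = \mu_{\mathbf{w}}.
```

So the scalar $\mu_{\mathbf{v}}$ is the same for all nonzero $\mathbf{v}$ (linearly dependent nonzero vectors share it trivially), which is the desired $\lambda$.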