
Prove or disprove:

Theorem. Let $V$ be a finite-dimensional vector space and $T:V\to V$ a linear transformation. Then the following are equivalent.

(1) There exist non-trivial $T$-invariant subspaces $U_1,U_2$ of $V$, such that $V=U_1\oplus U_2$.

(2) $V$ has a non-trivial $T$-invariant subspace.

It's obvious that (1) implies (2).

For the converse, suppose that $U$ is a non-trivial $T$-invariant subspace of $V$. The obvious approach to proving (1) is to choose $U_1=U$ and then find a suitable $U_2$. However, I have an example (leave a comment if you want details) which shows that this approach cannot work in general. The example works if you take $U_1$ to be a minimal (non-zero) $T$-invariant subspace of $U$ instead of $U$ itself, but I have not been able to turn this idea into a general proof.

2 Answers


One counterexample is a nonzero nilpotent operator $T$ on a $2$-dimensional space. In that case, a nontrivial invariant subspace must be one-dimensional, hence spanned by an eigenvector, and the only eigenvalue of a nilpotent operator is $0$. Since $T\neq 0$, the eigenspace for eigenvalue $0$ (spanned by any $Tx$ that is not $0$) is one-dimensional, so there is exactly one nontrivial invariant subspace and it has no invariant complement. E.g., think of $\begin{bmatrix}0&1\\0&0\end{bmatrix}$.
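As a quick numerical sanity check (my own sketch using numpy, not part of the original answer), one can verify that this matrix is nilpotent, that its only eigenvalue is $0$, and that the corresponding eigenspace is one-dimensional, so there is only a single invariant line:

```python
import numpy as np

# The nilpotent operator from the answer: T != 0 but T^2 = 0.
T = np.array([[0.0, 1.0],
              [0.0, 0.0]])

# T is nilpotent: T @ T is the zero matrix.
assert np.allclose(T @ T, np.zeros((2, 2)))

# Both eigenvalues are 0, so any invariant line lies in ker T.
eigenvalues = np.linalg.eigvals(T)
assert np.allclose(eigenvalues, 0.0)

# rank T = 1, so ker T is 1-dimensional: exactly one invariant line,
# and hence no T-invariant complement exists.
print(np.linalg.matrix_rank(T))
```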

  • Yep, that will do it. I had a reference which appeared to say $(1)\Leftrightarrow(2)$, but on rereading more carefully, they were only saying $(1)\Rightarrow(2)$. (2017-01-31)
  • BTW @Jonas, do you know what is usually meant by a decomposable linear map? I thought it was (1), but the reference in my previous comment defined it as (2), hence my belief that they must be the same. (2017-01-31)
  • I would guess (1) usually, but I would check the definitions in whatever context I'm in. Sometimes there's a distinction between "irreducible" and "indecomposable", and I don't assume which is which before seeing a definition or clarifying context (that might just be me, or might be due to inconsistencies I've seen). (2017-01-31)

Let $V=\mathbb R^2$, let $T$ be given by the matrix $\begin{pmatrix}1 & 1 \\ 0 & 1 \end{pmatrix}$, and let $U_1=\ker(T-I)$ (the linear span of $(1,0)$). Then $U_1$ is $T$-invariant.

Now suppose that $V=U_1\oplus U_2$ with $U_2$ also $T$-invariant.

It's your turn to show that there would then be an eigenvalue $\mu$ of $T$ with $\mu \ne 1$, a contradiction!
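The contradiction can be checked numerically (a numpy sketch of my own, not part of this answer): any invariant complement $U_2$ would be a second invariant line, hence spanned by an eigenvector outside $U_1$, but every eigenvalue of $T$ is $1$ and $\ker(T-I)$ is only one-dimensional:

```python
import numpy as np

# The operator from this answer: a single Jordan block for eigenvalue 1.
T = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Every eigenvalue of T equals 1 ...
eigenvalues = np.linalg.eigvals(T)
assert np.allclose(eigenvalues, 1.0)

# ... and rank(T - I) = 1, so dim ker(T - I) = 1: there is no second
# invariant line available to serve as U_2.
print(2 - np.linalg.matrix_rank(T - np.eye(2)))
```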

  • Thanks Fred, but that's effectively the same as Jonas's answer, so first in, best dressed :) (2017-01-31)