
Let $T: \mathbb{R}^3 \rightarrow \mathbb{R}^3 $ be the linear operator whose matrix in the standard basis is

$ A = \begin{pmatrix} 1 & 0 & 0 \\ 1 & 2 & 0 \\ 0 & 0 & 3 \end{pmatrix} $

The problem I am trying to understand is the following.

True or False? If $W$ is a $T$-invariant subspace of $\mathbb{R}^3$, then there exists a $T$-invariant subspace $W'$ of $\mathbb{R}^3$ such that $W \oplus W' = \mathbb{R}^3$. I think the answer is true, and I will list my ideas below, but I think there must be an easier way to approach this.

Since $A$ has distinct eigenvalues $1,2,3$, we see the minimal polynomial is $m_A(x) = (x-1)(x-2)(x-3)$,

and therefore, using the fundamental structure theorem for modules over a PID, we have $\mathbb{R}^3 \cong \mathbb{R}[x]/(x-1)\oplus\mathbb{R}[x]/(x-2)\oplus \mathbb{R}[x]/(x-3) \cong \mathbb{R}\oplus\mathbb{R}\oplus\mathbb{R}$ as $\mathbb{R}[x]$-modules.

From this calculation, does it follow that $W' = \mathbb{R}\oplus\mathbb{R}$ is the invariant subspace we are looking for?

Is there a way to do this problem without using the fundamental structure theorem for modules over a PID?
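
For what it's worth, here is a quick check of my starting point (a sketch using sympy, which is just my choice of tool and not part of the problem): it confirms that $A$ has the three distinct eigenvalues $1,2,3$ and that $(x-1)(x-2)(x-3)$ annihilates $A$.

```python
# Sanity check of the setup in sympy (my own script, not part of the problem).
from sympy import Matrix, eye, factor, symbols, zeros

x = symbols('x')
A = Matrix([[1, 0, 0],
            [1, 2, 0],
            [0, 0, 3]])

print(A.eigenvals())                    # {1: 1, 2: 1, 3: 1}: three distinct eigenvalues
print(factor(A.charpoly(x).as_expr()))  # (x - 1)*(x - 2)*(x - 3)

# With three distinct roots the minimal polynomial equals the characteristic
# polynomial, and indeed m_A(x) = (x-1)(x-2)(x-3) annihilates A:
assert (A - eye(3)) * (A - 2*eye(3)) * (A - 3*eye(3)) == zeros(3, 3)
```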

  • And another way to see the same thing is that the distinct irreducible polynomials $x-1$, $x-2$ and $x-3$ annihilate the respective summands. So this is nothing but the same argument that shows there are no non-trivial homomorphisms between the $\mathbf{Z}$-modules $\mathbf{Z}/2\mathbf{Z}$, $\mathbf{Z}/3\mathbf{Z}$, and $\mathbf{Z}/5\mathbf{Z}$. (2011-08-02)

3 Answers


I am not used to your notation, but I think I understand what you mean. If a linear transformation $T$ on an $n$-dimensional vector space $V$ has $n$ distinct eigenvalues, say $\{\lambda_1, \lambda_2,\ldots, \lambda_n \}$, then $V$ has a basis consisting of eigenvectors of $T$. Most proofs of this fact make (at least) implicit use of the fact that the $n$ polynomials $ \{p_i(x): 1 \leq i \leq n \}$, where $p_i(x)= \prod_{j \neq i}(x- \lambda_j)$, have greatest common divisor one. Probably the way to disguise this use as far as possible is to say that the minimum polynomial divides the characteristic polynomial, while whenever $\lambda$ is an eigenvalue of a linear transformation, $x - \lambda$ is a factor of its minimum polynomial. Hence when there are $n$ distinct eigenvalues, as here, the minimum polynomial must be $\prod_{i=1}^{n}(x-\lambda_i)$.

This means that, with the earlier notation, $p_i(T) \neq 0$ for $1 \leq i \leq n$ (each $p_i$ has degree $n-1$, so it is not a multiple of the minimum polynomial). On the other hand, $(T- \lambda_i)p_i(T)= 0$ for each $i$. Thus for each $i$, $p_i(T)V$ is a non-zero subspace of $V$ consisting of eigenvectors for $T$ associated to the eigenvalue $\lambda_i$. Since eigenvectors associated to different eigenvalues are linearly independent, this produces a basis of eigenvectors of $T$. Let $V_i$ be the $1$-dimensional $\lambda_i$-eigenspace for $T$ on $V$. Then we have $V = \oplus_{i=1}^{n}V_i$.
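
To make this concrete for the matrix $A$ of the question, here is a short sympy sketch (my own, not part of the argument) computing the $p_i(A)$ and checking that they behave as claimed:

```python
# Sketch in sympy for the matrix A of the question: each p_i(A) is non-zero,
# is killed by (A - lambda_i I), and its columns are lambda_i-eigenvectors.
from sympy import Matrix, eye, zeros

A = Matrix([[1, 0, 0],
            [1, 2, 0],
            [0, 0, 3]])
eigs = [1, 2, 3]

for lam in eigs:
    # p_lam(A) = product over the other eigenvalues mu of (A - mu*I)
    p_of_A = eye(3)
    for mu in eigs:
        if mu != lam:
            p_of_A = p_of_A * (A - mu * eye(3))

    assert p_of_A != zeros(3, 3)                       # p_i(T) is non-zero
    assert (A - lam * eye(3)) * p_of_A == zeros(3, 3)  # (T - lambda_i) p_i(T) = 0
    print(lam, p_of_A.columnspace())                   # spans the lambda_i-eigenspace
```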

If $W$ is a $T$-invariant subspace of $V$, then for some subset $I$ of $\{1,2,\ldots, n \}$ the minimum polynomial of $T$ on $W$ is $\prod_{i \in I}(x-\lambda_i)$ (it must divide the minimum polynomial of $T$ on $V$). Then for each $i \in I$, arguing as above, $W$ must contain the $\lambda_i$-eigenspace of $T$: $W$ does contain an eigenvector with that eigenvalue, and the eigenspace in $V$ is $1$-dimensional. Hence $W = \oplus_{i \in I}V_i$. Thus $V = W \oplus W'$, where $W' = \oplus_{i \in I'} V_i$ and $I' = \{1,2,\ldots, n\} \setminus I$.
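
A concrete instance of this last step for the question's $A$ (a sympy sketch; the particular choice $W = \operatorname{Span}\{v_1, v_2\}$ is mine):

```python
# W spanned by the eigenvectors for 1 and 2, so W' is the eigenspace for 3.
from sympy import Matrix

A  = Matrix([[1, 0, 0],
             [1, 2, 0],
             [0, 0, 3]])
v1 = Matrix([1, -1, 0])   # eigenvector for eigenvalue 1
v2 = Matrix([0,  1, 0])   # eigenvector for eigenvalue 2
v3 = Matrix([0,  0, 1])   # eigenvector for eigenvalue 3

assert A * v1 == 1 * v1 and A * v2 == 2 * v2 and A * v3 == 3 * v3

# W = span{v1, v2} and W' = span{v3} are A-invariant, and {v1, v2, v3}
# is a basis of R^3, i.e. W + W' is a direct sum equal to R^3.
assert Matrix.hstack(v1, v2, v3).det() != 0
```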


The answer is true for the given matrix $A$, since it is semi-simple (diagonalizable), as Geoff explained in his answer, but this is not true in general. For example, take $B = \begin{pmatrix} 1 & 0 & 0 \\ 1 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}.$ Then $W=\operatorname{Span}\{(0,1,0),(0,0,1)\}$ is $B$-invariant, but there is no $B$-invariant subspace $W'$ such that $W\oplus W'=\mathbb{R}^3$: such a $W'$ would be a $1$-dimensional $B$-invariant subspace, hence spanned by an eigenvector of $B$, and every eigenvector of $B$ already lies in $W$.
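
A quick sympy check of this (my own sketch): $B$ has $1$ as its only eigenvalue, and its eigenspace is exactly $W$, so there is no eigenvector of $B$ outside $W$ to span a complement.

```python
# B's only eigenvalue is 1, with a 2-dimensional eigenspace equal to W,
# so B has no eigenvector outside W and hence no invariant complement of W.
from sympy import Matrix

B = Matrix([[1, 0, 0],
            [1, 1, 0],
            [0, 0, 1]])

for eigenvalue, alg_mult, vectors in B.eigenvects():
    print(eigenvalue, alg_mult, vectors)
# Only eigenvalue 1 (algebraic multiplicity 3), with eigenvectors
# (0,1,0) and (0,0,1); the eigenspace is W itself.
```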


Let $a$ be an endomorphism of a finite-dimensional vector space $V$ over a field $K$. Assume that the eigenvalues of $a$ are in $K$. Let $\lambda$ be such an eigenvalue, let $f\in K[X]$ be the minimal polynomial of $a$, and let $d$ be its degree. Then there is a unique polynomial $g_\lambda$ of degree $ < d$ such that $g_\lambda(a)$ is the projector onto the generalized $\lambda$-eigenspace. Moreover, $g_\lambda$ is determined by the congruences $g_\lambda\equiv\delta_{\lambda,\mu}\ \bmod\ (X-\mu)^{m(\mu)}$ for all eigenvalues $\mu$, where $m(\mu)$ is the multiplicity of $\mu$ as a root of $f$, and where $\delta$ is the Kronecker symbol. If $K$ is of characteristic $0$, these congruences can be solved by Taylor's formula.

EDIT. We have $g_\lambda=T_\lambda\left(\frac{(X-\lambda)^{m(\lambda)}}{f}\right)\frac{f}{(X-\lambda)^{m(\lambda)}}\quad,$ where $T_\lambda$ means "degree $ < m(\lambda)$ Taylor polynomial at $X=\lambda$".
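
Here is a sketch of this formula in sympy for a case with a repeated eigenvalue (the matrix $C$ below is my own illustration, not from the question; there $f=(X-1)^2(X-3)$, so $m(1)=2$ and $m(3)=1$):

```python
# Compute g_lambda via the Taylor-polynomial formula and check that the
# g_lambda(C) are the projectors onto the generalized eigenspaces.
from sympy import Matrix, Poly, cancel, diag, eye, series, symbols, zeros

X = symbols('X')
C = Matrix([[1, 1, 0],
            [0, 1, 0],
            [0, 0, 3]])
f = (X - 1)**2 * (X - 3)          # minimal polynomial of C

def g(lam, m):
    """g_lambda = T_lambda((X - lam)^m / f) * f / (X - lam)^m."""
    taylor = series(cancel((X - lam)**m / f), X, lam, m).removeO()
    return cancel(taylor * f / (X - lam)**m)

def eval_poly(p, M):
    """Evaluate the polynomial p(X) at the square matrix M (Horner's rule)."""
    R = zeros(*M.shape)
    for c in Poly(p, X).all_coeffs():
        R = R * M + c * eye(M.shape[0])
    return R

P1, P3 = eval_poly(g(1, 2), C), eval_poly(g(3, 1), C)
assert P1 == diag(1, 1, 0)   # projector onto the generalized 1-eigenspace
assert P3 == diag(0, 0, 1)   # projector onto the 3-eigenspace
assert P1 + P3 == eye(3)     # the projectors sum to the identity
```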

In particular, if, as in your case, all the $m(\lambda)$ are equal to one, we get $g_\lambda=\prod_{\mu\not=\lambda}\ \frac{X-\mu}{\lambda-\mu}\quad.$
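
For the $A$ of the question this is just Lagrange interpolation; a sympy sketch (my own script) checking that the resulting $g_\lambda(A)$ are the spectral projectors:

```python
# Lagrange-interpolation projectors g_lambda(A) for the question's A.
from sympy import Matrix, eye, zeros

A = Matrix([[1, 0, 0],
            [1, 2, 0],
            [0, 0, 3]])
eigs = [1, 2, 3]

def g_at_A(lam):
    """g_lambda(A) = prod_{mu != lam} (A - mu*I) / (lam - mu)."""
    P = eye(3)
    for mu in eigs:
        if mu != lam:
            P = P * (A - mu * eye(3)) / (lam - mu)
    return P

projectors = {lam: g_at_A(lam) for lam in eigs}

assert sum(projectors.values(), zeros(3, 3)) == eye(3)      # the g_lambda(A) sum to I
assert all(P * P == P for P in projectors.values())         # each one is idempotent
assert sum((lam * P for lam, P in projectors.items()),
           zeros(3, 3)) == A                                # A = sum lambda * g_lambda(A)
```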