2

Let $T: \mathbb{R}^3 \rightarrow \mathbb{R}^3 $ be the linear operator

$$ A = \begin{pmatrix} 1 & 0 & 0 \\ 1 & 2 & 0 \\ 0 & 0 & 3 \end{pmatrix} $$

The problem I am trying to understand is the following.

True or False? If $W$ is a $T$-invariant subspace of $\mathbb{R}^3$, then there exists a $T$-invariant subspace $W'$ of $\mathbb{R}^3$ such that $W \oplus W' = \mathbb{R}^3$. I think the answer is true, and I will list my ideas below, but I think there must be an easier way to approach this.

Since $A$ has distinct eigenvalues $1,2,3$, we see the minimal polynomial is $m_A(x) = (x-1)(x-2)(x-3)$,

and therefore, viewing $\mathbb{R}^3$ as an $\mathbb{R}[x]$-module and using the fundamental structure theorem for modules over a PID, we have $\mathbb{R}^3 \cong \mathbb{R}[x]/(x-1)\oplus\mathbb{R}[x]/(x-2)\oplus \mathbb{R}[x]/(x-3) \cong \mathbb{R}\oplus\mathbb{R}\oplus\mathbb{R}$.

From this calculation does it follow that $W' = \mathbb{R}\oplus\mathbb{R}$ is the invariant subspace we are looking for?

Is there a way to do this problem without using the fundamental structure theorem for modules over a PID?
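As a quick sanity check (not part of the argument itself), the claim that $A$ has distinct eigenvalues $1,2,3$ and therefore a basis of eigenvectors can be verified numerically, for instance with NumPy:

```python
import numpy as np

# The operator T from the question, as a matrix acting on column vectors.
A = np.array([[1.0, 0.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

# A is lower triangular, so its eigenvalues are the diagonal entries 1, 2, 3.
eigvals, eigvecs = np.linalg.eig(A)
order = np.argsort(eigvals)
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
assert np.allclose(eigvals, [1.0, 2.0, 3.0])

# Distinct eigenvalues give three independent eigenlines, so the eigenvectors
# form a basis and R^3 is the direct sum of the three eigenspaces.
assert abs(np.linalg.det(eigvecs)) > 1e-9
```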

  • 1
    I think you can use Lagrange’s Interpolation Polynomial. (2011-08-02)
  • 1
    Your decomposition of $\mathbf{R}^3$ into a direct sum of 3 pairwise non-isomorphic simple $\mathbf{R}[T]$-submodules shows that there are several $T$-invariant subspaces. Some of them are 1-dimensional, some are 2-dimensional. I do think that you are nearly there. Have you covered *semisimple* modules in your course? (2011-08-02)
  • 1
    In the case of torsion modules, the fundamental structure theorem for modules over a PID reduces to the Chinese Remainder Theorem. (2011-08-02)
  • 0
    @Jyrki Lahtonen - No, we have not covered semisimple modules, but from my argument is it clear that $W'=\mathbb{R}\oplus\mathbb{R}$ or $W'=\mathbb{R}$? (2011-08-02)
  • 1
    Yes. But I would call them $\mathbf{R}_1, \mathbf{R}_2$ and $\mathbf{R}_3$ because, A) they are distinct spaces with clear identities, B) they are non-isomorphic as $\mathbf{R}[T]$-modules. So there are 3 different things you seem to call $\mathbf{R}$, and 3 different things that you call $\mathbf{R}\oplus\mathbf{R}$. Altogether 6 (non-trivial) choices for $W$. Finding the corresponding $W'$ in each case is straightforward, but in a first (or second) course on modules over PIDs, to get full marks you need to show that you are aware of this. (2011-08-02)
  • 1
    For example, if two of the 1-dimensional submodules were isomorphic, there would be infinitely many submodules, and you would need to do a little bit of work (not much, but some) to show that a complementary space can always be found. Also, if the space did not split into a direct sum of simple modules, things like Dennis Gulko's example appear. (2011-08-02)
  • 0
    @Jyrki Why are $\mathbb{R}_1$, $\mathbb{R}_2$ and $\mathbb{R}_3$ non-isomorphic as $\mathbb{R}[T]$-modules? (2011-08-02)
  • 0
    The element $T$ acts on them as multiplication by the respective constants $1,2,3$. An isomorphism of $\mathbf{R}[T]$-modules needs to respect that. The proof is left as an exercise. (2011-08-02)
  • 0
    And another way to see the same thing is that the irreducible polynomials $T-1$, $T-2$ and $T-3$ annihilate them. So this is nothing but the same argument that shows there are no non-trivial homomorphisms between the $\mathbf{Z}$-modules $\mathbf{Z}/2\mathbf{Z}$, $\mathbf{Z}/3\mathbf{Z}$, and $\mathbf{Z}/5\mathbf{Z}$. (2011-08-02)
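The Lagrange-interpolation idea from the first comment can be made concrete: for distinct eigenvalues, evaluating the polynomials $g_i(x)=\prod_{j\neq i}(x-\lambda_j)/(\lambda_i-\lambda_j)$ at $A$ yields the projectors onto the eigenspaces. A NumPy sketch (the helper name `lagrange_projector` is my own):

```python
import numpy as np

A = np.array([[1.0, 0.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
lams = [1.0, 2.0, 3.0]
I = np.eye(3)

def lagrange_projector(k):
    """Lagrange interpolation at the eigenvalues gives the projector onto
    the lams[k]-eigenspace (valid because the eigenvalues are distinct)."""
    P = I.copy()
    for j, mu in enumerate(lams):
        if j != k:
            P = P @ (A - mu * I) / (lams[k] - mu)
    return P

Ps = [lagrange_projector(k) for k in range(3)]

# The projectors are idempotent, sum to the identity, and A acts on the
# image of each as multiplication by the corresponding eigenvalue.
assert all(np.allclose(P @ P, P) for P in Ps)
assert np.allclose(sum(Ps), I)
assert all(np.allclose(A @ P, lam * P) for lam, P in zip(lams, Ps))
```

This is exactly the Chinese Remainder Theorem decomposition from the comments, written out in matrices.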

3 Answers

3

I am not used to your notation, but I think I understand what you mean. If a linear transformation $T$ on an $n$-dimensional vector space $V$ has $n$ distinct eigenvalues, say $\{\lambda_1, \lambda_2,\ldots, \lambda_n \}$, then $V$ has a basis consisting of eigenvectors of $T$. Most proofs of this fact make (at least) implicit use of the fact that the $n$ polynomials $\{p_i(x): 1 \leq i \leq n \}$, where $p_i(x)= \prod_{j \neq i}(x- \lambda_j)$, have greatest common divisor one. Perhaps the way to disguise this use as far as possible is to say that the minimum polynomial divides the characteristic polynomial, while whenever $\lambda$ is an eigenvalue of a linear transformation, $x - \lambda$ is a factor of its minimum polynomial. Hence when there are $n$ distinct eigenvalues, as here, the minimum polynomial must be $\prod_{i=1}^{n}(x-\lambda_i)$. This means that, with the earlier notation, $p_i(T) \neq 0$ for $1 \leq i \leq n$. On the other hand, $(T- \lambda_i I)p_i(T)= 0$ for each $i$. Thus for each $i$, $p_i(T)V$ is a non-zero subspace of $V$ consisting of eigenvectors of $T$ associated to the eigenvalue $\lambda_i$. Since eigenvectors associated to different eigenvalues are linearly independent, this produces a basis of eigenvectors of $T$. Let $V_i$ be the $1$-dimensional $\lambda_i$-eigenspace for $T$ on $V$. Then we have $V = \oplus_{i=1}^{n}V_i$.

If $W$ is a $T$-invariant subspace of $V$, then for some subset $I$ of $\{1,2,\ldots, n \}$, the minimum polynomial of $T$ on $W$ is $\prod_{i \in I}(x-\lambda_i)$ (it must divide the minimum polynomial of $T$ on $V$). Then for each $i \in I$, arguing as above, $W$ must contain the $\lambda_i$-eigenspace of $T$: $W$ contains an eigenvector with that eigenvalue, and the eigenspace in $V$ is $1$-dimensional. Hence $W = \oplus_{i \in I}V_i$. Thus $V = W \oplus W'$, where $W' = \oplus_{i \in I'} V_i$ and $I' = \{1,2,\ldots, n\} \setminus I$.
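This description of the invariant subspaces can be checked numerically for the matrix $A$ in the question: every subset of the eigenlines spans a $T$-invariant subspace, and the complementary subset supplies $W'$. A NumPy sketch (the helper `is_invariant` is my own):

```python
import numpy as np
from itertools import combinations

A = np.array([[1.0, 0.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

# Eigenvectors for eigenvalues 1, 2, 3, computed by hand from A - lam*I.
V = {1: np.array([1.0, -1.0, 0.0]),
     2: np.array([0.0, 1.0, 0.0]),
     3: np.array([0.0, 0.0, 1.0])}

def is_invariant(basis):
    """Check that A maps span(basis) into itself, by solving A@b = B@x
    in the least-squares sense and verifying the fit is exact."""
    if not basis:
        return True
    B = np.column_stack(basis)
    for b in basis:
        x, *_ = np.linalg.lstsq(B, A @ b, rcond=None)
        if not np.allclose(B @ x, A @ b):
            return False
    return True

# Every subset I of the eigenvalues gives an invariant W, and the
# complementary subset gives an invariant W' with W + W' = R^3.
for r in range(4):
    for subset in combinations([1, 2, 3], r):
        W  = [V[i] for i in subset]
        Wp = [V[i] for i in (1, 2, 3) if i not in subset]
        assert is_invariant(W) and is_invariant(Wp)
        assert np.linalg.matrix_rank(np.column_stack(W + Wp)) == 3
```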

2

The answer is true for the given matrix $A$, since it is semisimple (diagonalizable), as Geoff explained in his answer, but this is not true in general: for example, take $$B = \begin{pmatrix} 1 & 0 & 0 \\ 1 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$$ Then $W=\operatorname{span}\{(0,1,0),(0,0,1)\}$ is $B$-invariant, but there is no $B$-invariant subspace $W'$ such that $W\oplus W'=\mathbb{R}^3$: any such $W'$ would be a $1$-dimensional $B$-invariant subspace, hence spanned by an eigenvector of $B$, and every eigenvector of $B$ already lies in $W$.
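A numerical confirmation of this counterexample, assuming the observation that a complement of $W$ would have to be an eigenline of $B$: all eigenvalues of $B$ equal $1$, and the eigenspace $\ker(B-I)$ is exactly $W$, so no eigenline lies outside $W$.

```python
import numpy as np

B = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

# All eigenvalues of B are 1, so any 1-dimensional B-invariant subspace
# must be spanned by an eigenvector for the eigenvalue 1.
assert np.allclose(np.linalg.eigvals(B), [1.0, 1.0, 1.0])

# The eigenspace ker(B - I) is exactly W = span{(0,1,0),(0,0,1)}:
# (B - I)v = 0 forces the first coordinate of v to vanish.
E = B - np.eye(3)
assert np.linalg.matrix_rank(E) == 1                 # eigenspace is 2-dim
assert np.allclose(E @ np.array([0.0, 1.0, 0.0]), 0)  # (0,1,0) is in it
assert np.allclose(E @ np.array([0.0, 0.0, 1.0]), 0)  # (0,0,1) is in it
assert not np.allclose(E @ np.array([1.0, 0.0, 0.0]), 0)  # (1,0,0) is not
```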

1

Let $a$ be an endomorphism of a finite-dimensional vector space $V$ over a field $K$. Assume that the eigenvalues of $a$ are in $K$. Let $\lambda$ be such an eigenvalue, let $f\in K[X]$ be the minimal polynomial of $a$, and let $d$ be its degree. Then there is a unique polynomial $g_\lambda$ of degree $ < d$ such that $g_\lambda(a)$ is the projector onto the generalized $\lambda$-eigenspace. Moreover $g_\lambda$ is determined by the congruences $$g_\lambda\equiv\delta_{\lambda,\mu}\ \bmod\ (X-\mu)^{m(\mu)}$$ for all eigenvalues $\mu$, where $m(\mu)$ is the multiplicity of $\mu$ as a root of $f$, and where $\delta$ is the Kronecker symbol. If $K$ is of characteristic $0$, these congruences can be solved by Taylor's Formula.

EDIT. We have $$g_\lambda=T_\lambda\left(\frac{(X-\lambda)^{m(\lambda)}}{f}\right)\frac{f}{(X-\lambda)^{m(\lambda)}}\quad,$$ where $T_\lambda$ means "degree $ < m(\lambda)$ Taylor polynomial at $X=\lambda$".

In particular, if, as in your case, all the $m(\lambda)$ are equal to one, we get $$g_\lambda=\prod_{\mu\not=\lambda}\ \frac{X-\mu}{\lambda-\mu}\quad.$$
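In the present case, where all the $m(\lambda)$ equal one, this formula can be checked symbolically, for instance with SymPy (a sketch; the function name `g` is my own):

```python
import sympy as sp

x = sp.symbols('x')
lams = [1, 2, 3]   # the eigenvalues of A, each with multiplicity m(lam) = 1

def g(lam):
    """g_lambda in the multiplicity-one case: the Lagrange basis
    polynomial prod_{mu != lam} (x - mu)/(lam - mu)."""
    p = sp.Integer(1)
    for mu in lams:
        if mu != lam:
            p *= sp.Rational(1, lam - mu) * (x - mu)
    return sp.expand(p)

# g_lambda satisfies the stated congruences: since m(mu) = 1, the condition
# g_lambda = delta_{lam,mu} mod (x - mu) just says g_lambda(mu) = delta_{lam,mu}.
for lam in lams:
    for mu in lams:
        assert g(lam).subs(x, mu) == (1 if lam == mu else 0)

# The projectors onto the eigenspaces therefore sum to the identity.
assert sp.expand(g(1) + g(2) + g(3)) == 1
```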