
The question is from Axler's "Linear Algebra Done Right", which I'm using for self-study.

We are given a linear operator $T$ over a finite dimensional vector space $V$. We have to show that $T$ is a scalar multiple of the identity iff $\forall S \in {\cal L}(V), TS = ST$. Here, ${\cal L}(V)$ denotes the set of all linear operators over $V$.

One direction is easy to prove. If $T$ is a scalar multiple of the identity, then there exists a scalar $a$ such that $Tv = av$, $\forall v \in V$. Hence, given an arbitrary vector $w$, $TS(w) = T(Sw) = a(Sw) = S(aw) = S(Tw) = ST(w)$ where the third equality is possible because $S$ is a linear operator. Then, it follows that $TS = ST$, as required.
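
A quick numerical sanity check of this direction (a NumPy illustration of my own, not part of the argument above): a scalar multiple of the identity commutes with a randomly chosen operator.

```python
# Sanity check: T = a*I commutes with an arbitrary S (the easy direction).
import numpy as np

n = 4
a = 3.7
T = a * np.eye(n)                 # T = a I
S = np.random.randn(n, n)         # an arbitrary linear operator on R^n, in some basis

print(np.allclose(T @ S, S @ T))  # True
```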

I am, however, at a loss as to how to tackle the other direction. I thought that a proof by contradiction, ultimately constructing a linear operator $S$ for which $TS \neq ST$, might be the way to go, but haven't made much progress.

Thanks in advance!

  • 0
    I'd cheat and use the fact that $\mathfrak{gl}_n(k)$ is reductive, and $\mathfrak{gl}_n(k) \cong k \oplus \mathfrak{sl}_n(k)$. $k$ is an abelian ideal so it's contained in the center; $\mathfrak{sl}_n(k)$ is semi-simple so it contains no non-trivial abelian ideals, thus $k$ is indeed the center. (2011-03-18)

9 Answers

0

Let $\{e_1,\dots,e_n\}$ be a basis for the space $V$. Then you need to show that there exists a scalar $a$ such that $Te_i = a e_i$ for $i = 1,\dots,n$ (every linear operator on a finite-dimensional space is determined by its values on basis vectors).

Let $S_i \in L(V)$ be defined by $S_i(a_1 e_1 + \dots + a_n e_n) = a_i e_i$, and write $Te_i = b_{i,1} e_1 + \dots + b_{i,n} e_n$. Then $$ T e_i = TS_i e_i = S_i Te_i = b_{i,1} S_i e_1 + \dots + b_{i,n} S_i e_n = b_{i,i} e_i. $$ Now we need to show that $b_{i,i} = b_{j,j}$ for all $i,j=1,\dots,n$. For given $i,j$ let $S \in L(V)$ be defined by $$ S(a_1 e_1 + \dots + a_n e_n) = a_j e_i + a_i e_j. $$ Then $$ b_{i,i} e_i + b_{j,j} e_j = T(e_i + e_j) = TS(e_i + e_j) = ST(e_i + e_j) = S(b_{i,i} e_i + b_{j,j} e_j) = b_{j,j} e_i + b_{i,i} e_j. $$ Hence, because the $e_k$ form a basis, we obtain $b_{i,i} = b_{j,j}$, which completes the proof.
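
A small numerical sketch of this construction (my own NumPy illustration, not part of the answer; the helper names `S_single` and `S_swap` are made up): the two families of operators above, written as matrices in the standard basis of $\mathbb{R}^3$, all commute with a scalar matrix, and at least one of them detects a non-scalar one.

```python
import numpy as np

n = 3

def S_single(i, n):
    """Matrix of S_i: (a_1 e_1 + ... + a_n e_n) |-> a_i e_i."""
    M = np.zeros((n, n))
    M[i, i] = 1.0
    return M

def S_swap(i, j, n):
    """Matrix of S: (a_1 e_1 + ... + a_n e_n) |-> a_j e_i + a_i e_j."""
    M = np.zeros((n, n))
    M[j, i] = 1.0
    M[i, j] = 1.0
    return M

checks = [S_single(i, n) for i in range(n)] + \
         [S_swap(i, j, n) for i in range(n) for j in range(i + 1, n)]

T_scalar = 7.0 * np.eye(n)
print(all(np.allclose(T_scalar @ S, S @ T_scalar) for S in checks))   # True

T_bad = np.array([[1., 2., 0.],
                  [0., 1., 0.],
                  [0., 0., 3.]])                                      # not scalar
print(any(not np.allclose(T_bad @ S, S @ T_bad) for S in checks))     # True: detected
```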

45

For a basis-free answer, consider $S \in L(V)$ given by $S x = f(x) v$ for some vector $v$ and some linear functional $f$ on $V$. Then $T S x = f(x) T v$ and $S T x = f(T x) v$ for any $x$, so commutativity gives $f(x)\, T v = f(T x)\, v$. In particular, as long as a nontrivial linear functional $f$ on $V$ exists, there is an $x$ such that $f(x) \ne 0$, and then $T v = \alpha v$ for every $v$, where $\alpha = f(T x)/f(x)$ does not depend on $v$. This works even for infinite-dimensional spaces, although I think in general you need the Axiom of Choice to get a nontrivial linear functional on a vector space.
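
A small numerical illustration of this rank-one trick (my own NumPy sketch, not part of the answer): $Sx = f(x)v$ is the rank-one matrix $v f^{T}$, and the scalar $\alpha = f(Tx)/f(x)$ can be read off directly.

```python
import numpy as np

n = 4
alpha = 2.5
T = alpha * np.eye(n)              # a scalar operator, consistent with the conclusion

f = np.random.randn(n)             # a linear functional on R^n, as a row vector
x = np.random.randn(n)
assert abs(f @ x) > 1e-12          # make sure f(x) != 0 (rerun if unlucky)

v = np.random.randn(n)
S = np.outer(v, f)                 # S y = f(y) v

print(np.allclose(T @ S, S @ T))   # True
print((f @ (T @ x)) / (f @ x))     # recovers alpha = f(Tx)/f(x) = 2.5
```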

  • 4
    I like this, but it somewhat disturbs me that in the beginning a vector $v$ is chosen, and later $Tv=\alpha v$ is concluded for _all_ $v$. I can see that the proof is OK, but I think the presentation could make the order of things more evident. (2013-10-30)
21

Suppose $TS = ST$ for every $S$. Show that $Tv = a_{v}v$ for every $v\in V$ where $a_v$ could depend on $v$. In other words, show that $v$ and $Tv$ are linearly dependent for each $v \in V$.

Suppose for contradiction that $v$ and $Tv$ are linearly independent. Then the list $(v, Tv)$ can be extended to a basis $(v, Tv, u_1, \dots, u_n)$ of $V$. Define $S$ as follows: $Sv = v$, $S(Tv) = v$, and $S(u_1) = \dots = S(u_n) = 0$. Then $Tv = TSv = STv = v$, so $v$ and $Tv$ are linearly dependent, which is a contradiction. Then you have to show uniqueness, i.e. that $a_v$ is the same scalar for every $v$.
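
A concrete instance of this contradiction (my own NumPy sketch, not part of the answer): take a $T$ and a $v$ with $(v, Tv)$ linearly independent, build $S$ exactly as above, and watch it fail to commute with $T$.

```python
import numpy as np

T = np.array([[0., -1.],
              [1.,  0.]])               # rotation by 90 degrees: v and Tv are independent
v = np.array([1., 1.])
Tv = T @ v

B = np.column_stack([v, Tv])            # the basis (v, Tv) of R^2
S_in_B = np.array([[1., 1.],            # columns: S v = v and S(Tv) = v, in B-coordinates
                   [0., 0.]])
S = B @ S_in_B @ np.linalg.inv(B)       # the same operator in the standard basis

print(T @ S @ v, S @ T @ v)             # TSv = Tv, but STv = v: so TS != ST
```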

  • 3
    Thanks! Uniqueness can be shown by taking a basis $(v_{1},\dots,v_{n})$ and writing $Tv_{i} = b_{i}v_{i}$. For $v = a_{1}v_{1}+\dots+a_{n}v_{n}$ we have $Tv = T(a_{1} v_{1}+\dots+a_{n}v_{n}) = a_{1}b_{1}v_{1}+\dots+a_{n}b_{n}v_{n}$, but also $Tv = kv = k(a_{1} v_{1}+\dots+a_{n}v_{n}) = a_{1}kv_{1}+\dots+a_{n}kv_{n}$. Since $Tv$ is written in a unique way as a linear combination of said basis, taking a $v$ with every $a_{i}\neq 0$ shows that all the $b_i$ are equal to $k$. Hence $Tv_{i} = kv_{i}$, and so $Tv = kv$ for all $v$. (2011-03-18)
11

In general, when one has a condition of the form "$A$ is a blah if and only if for every $B$ this happens", the "if" direction can often be established by selecting suitably/cleverly chosen $B$ that show everything works.

This is just such a situation.

Let $\beta = \{\mathbf{v}_i\}_{i\in I}$ be a basis for $\mathbf{V}$. For each $i,j\in I$, let $S_{ij}$ be the linear operator on $\mathbf{V}$ given by $S_{ij}(\mathbf{v}_k) = \left\{\begin{array}{ll} \mathbf{v}_j &\mbox{if $k=i$,}\\ \mathbf{v}_i &\mbox{if $k=j$,}\\ \mathbf{0} &\mbox{if $k\neq i$ and $k\neq j$.} \end{array}\right.$ That is: for $i\neq j$, $S_{ij}$ exchanges $\mathbf{v}_i$ and $\mathbf{v}_j$, and maps all other basis elements to $\mathbf{0}$. And $S_{ii}$ maps $\mathbf{v}_i$ to itself, and all other basis elements to $\mathbf{0}$. These are our "suitably chosen" $S$.

Now consider $S_{ii}T(\mathbf{v}_j)$ and $TS_{ii}(\mathbf{v}_j)$ first to get information about what $T$ does to $\beta$; then consider $S_{ij}T(\mathbf{v}_j)$ and $TS_{ij}(\mathbf{v}_j)$ for $i\neq j$.
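
For completeness, here is one way that computation can go (my own filling-in of the hint, using only the definitions above; skip it if you would rather work it out yourself). Writing $T\mathbf{v}_j = \sum_k t_{kj}\mathbf{v}_k$ (a finite sum), $$S_{ii}T(\mathbf{v}_j) = t_{ij}\mathbf{v}_i, \qquad TS_{ii}(\mathbf{v}_j) = \begin{cases} T\mathbf{v}_i & \text{if } j = i,\\ \mathbf{0} & \text{if } j \neq i,\end{cases}$$ so equating the two for $j \neq i$ gives $t_{ij} = 0$, i.e. $T\mathbf{v}_j = t_{jj}\mathbf{v}_j$ for every $j$. Then, for $i \neq j$, $$S_{ij}T(\mathbf{v}_j) = t_{jj}S_{ij}(\mathbf{v}_j) = t_{jj}\mathbf{v}_i, \qquad TS_{ij}(\mathbf{v}_j) = T\mathbf{v}_i = t_{ii}\mathbf{v}_i,$$ which forces $t_{ii} = t_{jj}$.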

8

One proof of your "other direction":

If $T$ and $S$ are two operators that commute, then $S(\mathrm{Ker}T)\subseteq \mathrm{Ker}T$. Indeed, $v \in \mathrm{Ker}T \Rightarrow T(Sv)=S(Tv)=S(0)=0$. In words, $\mathrm{Ker}T$ is invariant under $S$.

So, in our case we have that $\mathrm{Ker}T$ is invariant under every linear transformation in $L(V)$. This implies that $\mathrm{Ker}T = V$ or $\mathrm{Ker}T = 0$: otherwise we could pick a nonzero $u \in \mathrm{Ker}T$ and a $w \notin \mathrm{Ker}T$, and any $S\in L(V)$ with $Su = w$ would satisfy $S(\mathrm{Ker}T)\nsubseteq\mathrm{Ker}T$.

We now show that $T$ has an eigenvalue. Let $S \in L(V)$ be a projection onto a one-dimensional subspace $\langle v \rangle$ with $v \neq 0$. Since $Sv = v$, we have $Tv=T(Sv)=S(Tv) \in \langle v\rangle$; equivalently, $Tv=\lambda v$ for some $\lambda \in \mathbb K$.

Since $TS=ST \iff (T-\lambda I)S=S(T-\lambda I)$ for any $S \in L(V)$, we have as above that $\mathrm{Ker}(T-\lambda I)=V$ or $\mathrm{Ker}(T-\lambda I)=0.$ But, since $\lambda$ is an eigenvalue, $\mathrm{Ker}(T-\lambda I)\neq 0$. Therefore, $T=\lambda I$. QED.
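
A numerical companion to the projection step (my own NumPy sketch, not part of the answer): if $T$ is not a scalar multiple of the identity, there is a $v$ with $Tv \notin \langle v\rangle$, and a projection onto $\langle v\rangle$ then fails to commute with $T$.

```python
import numpy as np

T = np.array([[2., 1.],
              [0., 2.]])               # not a scalar matrix
v = np.array([0., 1.])                 # Tv = (1, 2) is not a multiple of v

S = np.outer(v, v) / (v @ v)           # orthogonal projection onto span(v)
print(np.allclose(T @ S, S @ T))       # False: this S witnesses TS != ST
```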

This result is a special case of Schur's Lemma, which states: "If $T$ is an operator on $V$ with an eigenvalue $\lambda \in \mathbb K$, and $C \subseteq L(V)$ is a set of operators such that $TS=ST$ for all $S \in C$ and such that for every non-trivial subspace $W$ there is some $S \in C$ with $S(W) \nsubseteq W$, then $T=\lambda I$." Its proof is essentially the one given above.

5

Perhaps somewhat against the spirit of Axler's book, whose motto is "linear maps over matrices" (although the argument is quite conceptual):

Suppose $T$ commutes with all $S$. Then in particular it commutes with all invertible $S$: so $T=STS^{-1}$ for all invertible $S$. But this means the matrix of $T$ is the same, no matter what basis we choose!

Then it must be diagonal: for fixed $j$, replace basis vector $e_j$ with $2e_j$; then if $i\neq j$, we get $t_{ij}=2t_{ij}$, so $t_{ij}=0$.

Edit [elaboration on the previous line]: Suppose $T$ has matrix $(t_{ij})_{ij}$ w.r.t. the basis $\{e_1,\dots,e_n\}$. Fix $k$, and consider the basis $B_k=\{v_1,\dots,v_n\}$ where $v_i=e_i$ if $i\neq k$ and $v_k=2e_k$. Then, for $i\neq k$, the matrix of $T$ w.r.t. $B_k$ has $2t_{ik}$ at entry $i,k$. Hence $t_{ik}=2t_{ik}$, and consequently $t_{ik}=0$. [End edit.]

Then also all diagonal entries are the same: for fixed $i$ and $j$, interchange $e_j$ and $e_i$, and get $t_{ii}=t_{jj}$.
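
A numerical sketch of this change-of-basis argument (my own NumPy illustration, not part of the answer): rescaling one basis vector conjugates the matrix of $T$ by a diagonal matrix, and swapping two basis vectors conjugates it by a permutation matrix; a matrix left unchanged by both must be scalar.

```python
import numpy as np

T = np.array([[1., 4.],
              [0., 3.]])                                  # not a scalar matrix

D = np.diag([1., 2.])                                     # replace e_2 by 2*e_2
print(np.allclose(T, np.linalg.inv(D) @ T @ D))           # False: entry (1,2) doubles

P = np.array([[0., 1.],                                   # swap e_1 and e_2
              [1., 0.]])
T_diag = np.diag([1., 3.])                                # diagonal but not scalar
print(np.allclose(T_diag, np.linalg.inv(P) @ T_diag @ P)) # False: t_11 != t_22
```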

  • 0
    Yes. I made an edit to make the argument clearer. (2011-03-19)
1

If $v$ is an eigenvector of $T$ with eigenvalue $\lambda$, then $Sv$ is also an eigenvector with the same eigenvalue whenever $Sv \neq 0$, since $T(Sv) = S(Tv) = \lambda(Sv)$. For any two nonzero $v$ and $w$ in $V$ there exists a transformation $S$ mapping $v$ to $w$, so all nonzero elements of $V$ are eigenvectors with eigenvalue $\lambda$.

  • 2
    We aren't necessarily working over an algebraically closed field. (2011-03-18)
1

Multiples of the identity commute with all other matrices. Now consider some other matrix $A$ defined by $ A = \begin{bmatrix} a & b \\ c & d \\ \end{bmatrix}. $ We see that $ \begin{bmatrix} a & b \\ c & d \\ \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & 0 \\ \end{bmatrix} = \begin{bmatrix} a & 0 \\ c & 0 \\ \end{bmatrix}, $ and $ \begin{bmatrix} 1 & 0 \\ 0 & 0 \\ \end{bmatrix} \begin{bmatrix} a & b \\ c & d \\ \end{bmatrix} = \begin{bmatrix} a & b \\ 0 & 0 \\ \end{bmatrix}. $ For these two matrices to be equal we need $b=c=0$. Doing the same trick with $ \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} $ gives $a=d$. So all $2\times 2$ matrices that commute with these two matrices are multiples of the identity.
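
A quick numerical check of these two products (my own NumPy sketch, not part of the answer):

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])
E = np.array([[1., 0.],
              [0., 0.]])
J = np.array([[0., 1.],
              [1., 0.]])

print(A @ E - E @ A)     # nonzero off-diagonal entries: commuting with E forces b = c = 0
D = np.diag([1., 4.])    # b = c = 0 but a != d
print(D @ J - J @ D)     # nonzero: commuting with J additionally forces a = d
```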

  • 3
    This answer is on the right track to a proper full answer, but the general question is not restricted to $2\times 2$ matrices. (2016-03-06)