
I am studying some lecture notes where I found this result:

Let $A$ be an operator on a vector space $V$. If $V$ is the direct sum of two subspaces $U$ and $W$, i.e. $V = U \oplus W$, and if the basis $\{e_1,\ldots,e_n\}$ of $V$ is such that the first $k$ vectors $\{e_1,\ldots,e_k\}$ form a basis of $U$ and the vectors $\{e_{k+1},\ldots,e_n\}$ form a basis of $W$, then the matrix of $A$ has the following form

$\left( \begin{array}{cc} B & 0 \\ 0 & C \end{array} \right)$

The notes do not say anything about the matrices $B$ and $C$.

I am finding it difficult to understand this. Could anybody help me with it? Thanks a ton.


2 Answers


Claim. Assume we are given a decomposition $V=U\oplus W$ and a linear operator $A:V\to V$. Let $\mathbf{e}_U=\{e_1,\ldots,e_k\}$ be a basis of $U$ ($U=\mathrm{span}\{e_1,\ldots,e_k\}$), and $\mathbf{e}_W=\{e_{k+1},\ldots,e_n\}$ be a basis of $W$ ($W=\mathrm{span}\{e_{k+1},\ldots,e_n\}$). Then the following conditions are equivalent:

1) $A(U)\subset U$, $A(W)\subset W$

2) the matrix $[A]$ of the operator $A$ in the basis $\mathbf{e}_V=\{e_1,\ldots,e_n\}$ is block diagonal.

Proof. $(\Longrightarrow)$ Let $i\in\{1,\ldots,k\}$; then $e_i\in U$ and $A(e_i)\in A(U)\subset U$. Since $U=\mathrm{span}\{e_1,\ldots,e_k\}$, we have $A(e_i)=\sum_{j=1}^k a_{j,i} e_j$. This means that the coordinate representation of the vector $A(e_i)$ in the basis $\mathbf{e}_V$ is $ [A(e_i)]=\begin{pmatrix}a_{1,i}\\ a_{2,i}\\ \vdots\\ a_{k,i}\\ 0\\ 0\\ \vdots\\ 0\end{pmatrix}\tag{1} $ Hence the last $n-k$ coordinates of $A(e_i)$ in the basis $\mathbf{e}_V$ are zero.

Now let $i\in\{k+1,\ldots,n\}$; then $e_i\in W$ and $A(e_i)\in A(W)\subset W$. Since $W=\mathrm{span}\{e_{k+1},\ldots,e_n\}$, we have $A(e_i)=\sum_{j=k+1}^n a_{j,i} e_j$. This means that the coordinate representation of the vector $A(e_i)$ in the basis $\mathbf{e}_V$ is $ [A(e_i)]=\begin{pmatrix}0\\ 0\\ \vdots\\ 0\\ a_{k+1,i}\\ a_{k+2,i}\\ \vdots\\ a_{n,i}\end{pmatrix}\tag{2} $ Hence the first $k$ coordinates of $A(e_i)$ in the basis $\mathbf{e}_V$ are zero.

Recall that the matrix of an operator in a given basis is the collection of columns $[A(e_i)]$, so the matrix $[A]$ of the operator $A$ is $ [A]= \left(\begin{array}{cccc|cccc} a_{1,1} & a_{1,2} & \ldots & a_{1,k} & 0 & 0 & \ldots & 0\\ a_{2,1} & a_{2,2} & \ldots & a_{2,k} & 0 & 0 & \ldots & 0\\ \ldots & \ldots & \ldots & \ldots & \ldots & \ldots & \ldots & \ldots\\ a_{k,1} & a_{k,2} & \ldots & a_{k,k} & 0 & 0 & \ldots & 0\\ \hline 0 & 0 & \ldots & 0 & a_{k+1,k+1} & a_{k+1,k+2} & \ldots & a_{k+1,n}\\ 0 & 0 & \ldots & 0 & a_{k+2,k+1} & a_{k+2,k+2} & \ldots & a_{k+2,n}\\ \ldots & \ldots & \ldots & \ldots & \ldots & \ldots & \ldots & \ldots\\ 0 & 0 & \ldots & 0 & a_{n,k+1} & a_{n,k+2} & \ldots & a_{n,n} \end{array}\right)\tag{3} $ This means exactly that $[A]$ is block diagonal.
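(Not part of the original answer.) Here is a small NumPy sketch of how the columns $[A(e_i)]$ assemble into the block-diagonal matrix $(3)$; the operator, the blocks $B$ and $C$, and the dimensions are made up purely for illustration.

```python
import numpy as np

n, k = 4, 2
B = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # action of A on U = span{e_1, e_2}
C = np.array([[5.0, 6.0],
              [7.0, 8.0]])   # action of A on W = span{e_3, e_4}

def A(v):
    # Apply A to a coordinate vector v, using that A(U) is contained in U
    # and A(W) is contained in W.
    u, w = v[:k], v[k:]
    return np.concatenate([B @ u, C @ w])

# The matrix of A in the basis e_1, ..., e_n is the collection of columns [A(e_i)].
E = np.eye(n)
M = np.column_stack([A(E[:, i]) for i in range(n)])
print(M)
# The off-diagonal blocks are zero, exactly as in (3).
assert np.allclose(M[k:, :k], 0.0) and np.allclose(M[:k, k:], 0.0)
```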

$(\Longleftarrow)$ Assume that the matrix $[A]$ is block diagonal, i.e. of the form $(3)$.

Let $i\in\{1,\ldots,k\}$; then the coordinate vector of $A(e_i)$ is of the form $(1)$. This is equivalent to saying that $A(e_i)=\sum_{j=1}^k a_{j,i} e_j\in\mathrm{span}\{e_1,\ldots,e_k\}=U$. Take an arbitrary $x\in U$; then $x=\sum_{j=1}^k x_j e_j$, so $A(x)=\sum\limits_{j=1}^k x_j A(e_j)\in\mathrm{span}\{A(e_1),\ldots,A(e_k)\}\subset\mathrm{span}\{e_1,\ldots,e_k\}=U$. Since $A(x)\in U$ for all $x\in U$, we conclude $A(U)\subset U$.

Let $i\in\{k+1,\ldots,n\}$; then the coordinate vector of $A(e_i)$ is of the form $(2)$. This is equivalent to saying that $A(e_i)=\sum_{j=k+1}^n a_{j,i} e_j\in\mathrm{span}\{e_{k+1},\ldots,e_n\}=W$. Take an arbitrary $x\in W$; then $x=\sum_{j=k+1}^n x_j e_j$, so $A(x)=\sum\limits_{j=k+1}^n x_j A(e_j)\in\mathrm{span}\{A(e_{k+1}),\ldots,A(e_n)\}\subset\mathrm{span}\{e_{k+1},\ldots,e_n\}=W$. Since $A(x)\in W$ for all $x\in W$, we conclude $A(W)\subset W$.
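(Again, my own addition rather than part of the answer.) A quick numerical check of the $(\Longleftarrow)$ direction, assuming NumPy: a block-diagonal matrix sends coordinate vectors supported on the first $k$ entries (vectors in $U$) to vectors of the same type, and likewise for $W$. The random blocks are arbitrary placeholders.

```python
import numpy as np

k, n = 2, 5
B = np.random.rand(k, k)
C = np.random.rand(n - k, n - k)
M = np.block([[B, np.zeros((k, n - k))],
              [np.zeros((n - k, k)), C]])   # block diagonal, as in (3)

x_U = np.concatenate([np.random.rand(k), np.zeros(n - k)])  # a vector in U
x_W = np.concatenate([np.zeros(k), np.random.rand(n - k)])  # a vector in W

# M x_U stays in U (last n-k coordinates zero); M x_W stays in W.
assert np.allclose((M @ x_U)[k:], 0.0)
assert np.allclose((M @ x_W)[:k], 0.0)
```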

  • @Norbert Certainly it was not easy to write in this much detail. Thanks a ton. (2012-08-13)

Let me give you an example where you can write your matrix like this. Let $P : \Bbb{R}^3 \to \Bbb{R}^3$ be the map that sends $(x,y,z)$ to $(x,y,0)$. Then the matrix of $P$ in the standard basis of $\Bbb{R}^3$ is

$\left(\begin{array}{cc|c} 1 & 0 & 0 \\ 0 & 1 & 0 \\ \hline 0 & 0 & 0 \end{array}\right).$

This reflects the facts that (1) $\Bbb{R}^3 = (xy\text{-plane}) \oplus (z\text{-axis})$ and (2) the $xy$-plane and the $z$-axis are stable under the action of $P$.
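If it helps, here is a tiny NumPy check of this example (my addition, not the answerer's): the matrix of $P$ sends vectors in the $xy$-plane to vectors in the $xy$-plane and vectors on the $z$-axis to the zero vector, so both subspaces are mapped into themselves.

```python
import numpy as np

# Matrix of P(x, y, z) = (x, y, 0) in the standard basis of R^3.
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])

v_plane = np.array([2.0, -3.0, 0.0])  # a vector in the xy-plane
v_axis  = np.array([0.0, 0.0, 5.0])   # a vector on the z-axis

print(P @ v_plane)  # [ 2. -3.  0.]  -- still in the xy-plane
print(P @ v_axis)   # [ 0.  0.  0.]  -- still on the z-axis (the zero vector)
```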

Proof of one direction of Norbert's Claim. Suppose that $V = U \oplus W$ with $A : V \to V$ and that $U$ and $W$ are stable under the action of $A$. If $e_1,\ldots,e_n$ is a basis for $U$ and $e_{n+1},\ldots,e_m$ is a basis for $W$, this means that

$A(e_1)$ = linear combination of the basis vectors of $U$,

$\vdots$

$A(e_n)$ = linear combination of the basis vectors of $U$,

$A(e_{n+1})$ = linear combination of the basis vectors of $W$,

$\vdots$

$A(e_m)$ = linear combination of the basis vectors of $W$.

This means that if the matrix of $A$ is written in the basis $e_1,\ldots,e_m$, then the first column, which is $A(e_1)$, has all entries zero below the $n$-th row, and similarly every column up to the $n$-th one, which is $A(e_n)$, has all entries zero below the $n$-th row.

Similarly we find that $A(e_{n+1})$, the $(n+1)$-th column of the matrix, has all zeros as its first $n$ entries, and the same pattern holds all the way up to $A(e_m)$. Putting all of this together, you get that the matrix is in block diagonal form.
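(A hedged sketch of my own, using this answer's indices $n$ and $m$.) The column-by-column zero pattern described above can be checked numerically: columns $1,\ldots,n$ vanish below the $n$-th row and columns $n+1,\ldots,m$ vanish in the first $n$ rows. The blocks below are random placeholders.

```python
import numpy as np

n, m = 2, 5
B = np.random.rand(n, n)          # A restricted to U, in the basis e_1, ..., e_n
C = np.random.rand(m - n, m - n)  # A restricted to W, in the basis e_{n+1}, ..., e_m
M = np.block([[B, np.zeros((n, m - n))],
              [np.zeros((m - n, n)), C]])

for i in range(m):
    col = M[:, i]                       # the column [A(e_{i+1})]
    if i < n:
        assert np.allclose(col[n:], 0)  # zero entries after the n-th row
    else:
        assert np.allclose(col[:n], 0)  # first n entries are zero
```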

Can you prove the other direction?

  • Thank you very much. Your answer is very helpful to me, but I have to accept only one answer. Thank you. (2012-08-13)