
Suppose we have two sets of linearly independent vectors $$\{\vec{u_1},\ldots,\vec{u_k}\},\quad\{\vec{v_1},\ldots,\vec{v_{n-k}}\}$$ that span two mutually orthogonal hyper-planes (i.e. $\vec{u_i}\cdot\vec{v_j}=0$ for all $i,j$) and together span $\mathbb{R}^n$.

Then we can write any $\vec{x}\in\mathbb{R}^n$ with a change of basis $$\vec{x}=a_1\vec{u_1}+\cdots+a_k\vec{u_k}+b_1\vec{v_1}+\cdots+b_{n-k}\vec{v_{n-k}}$$

Or equivalently $$\vec{x}=U\vec{a}+V\vec{b}$$ where $U$ is $n\times k$, $V$ is $n\times(n-k)$, $\vec{a}=(a_1,\ldots,a_k)$, and $\vec{b}=(b_1,\ldots,b_{n-k})$.
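As a quick numerical sanity check (a sketch using NumPy; the particular $U$, $V$, and $\vec{x}$ below are arbitrary illustrative choices, not from the question), the coefficients $\vec{a}$ and $\vec{b}$ can be recovered by solving the full-rank system $[U\ V]\begin{bmatrix}\vec{a}\\\vec{b}\end{bmatrix}=\vec{x}$:

```python
import numpy as np

# Illustrative orthogonal subspaces of R^3: U spans the xy-plane, V the z-axis
U = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])      # n x k with n = 3, k = 2
V = np.array([[0.0],
              [0.0],
              [1.0]])           # n x (n - k)

x = np.array([3.0, -2.0, 5.0])

# Stack the two bases into an invertible n x n matrix and solve [U V] [a; b] = x
M = np.hstack([U, V])
coeffs = np.linalg.solve(M, x)
a, b = coeffs[:2], coeffs[2:]

# Reconstruction: x = U a + V b
assert np.allclose(U @ a + V @ b, x)
print(a, b)
```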

My question is: given a full-rank linear transformation $T$ on $\mathbb{R}^n$, can $T$ always be split into two transformations on the hyper-planes spanned by $\{\vec{u_1},\ldots,\vec{u_k}\}$ and $\{\vec{v_1},\ldots,\vec{v_{n-k}}\}$? Specifically, will there exist transformations $R, S$ such that $$T\vec{x}=U(R\vec{a})+V(S\vec{b})\,?$$

Also, will $R,S$ be linear? And how could you find them given an actual matrix?

For a concrete example, take $$ T= \begin{bmatrix} 0 & 0 & 0 & \frac{1}{2} \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 1 \end{bmatrix}$$ where the hyper-planes are, respectively, the one spanned by the (real) eigenvectors and the one consisting of all vectors orthogonal to them: $$\{\vec{u_1}, \vec{u_2}\}\approx\{(.40,.32,.25,1),(-.75,1.12,-1.67,1)\}$$

1 Answer


If I understand your question correctly (let me know otherwise), you are asking whether, given a basis $v_1,\ldots,v_n,w_1,\ldots,w_m$ with $v_i\cdot w_j=0$ for every $i,j$, and a linear transformation $T$ on the whole space, we can decompose $T$ as a direct sum $T=T_1\oplus T_2$, so that $T(v+w)=T_1v+T_2w$ for every $v\in V=\text{span}\{v_1,\ldots,v_n\}$ and $w\in W=\text{span}\{w_1,\ldots,w_m\}$.

The answer is then no. Here is the classic example. Consider the two-dimensional vector space $\mathbb{R}^2$ with basis $(1,0),(0,1)$. The linear transformation $T=\begin{bmatrix}1&1\\0&1\end{bmatrix}$ is not diagonalizable, so it does not decompose as a sum $T_1\oplus T_2$: such a decomposition into two one-dimensional pieces would give two linearly independent eigenvectors, and $T$ has only one eigendirection.
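One way to see this numerically (a sketch using NumPy): the only eigenvalue of this shear is $1$, and its eigenspace $\ker(T-I)$ is one-dimensional, so no pair of invariant one-dimensional subspaces can exist.

```python
import numpy as np

T = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# A direct-sum decomposition into two 1-D invariant subspaces would
# give two linearly independent eigenvectors.  The only eigenvalue
# is 1, and its eigenspace ker(T - I) has dimension 2 - rank(T - I):
eigenspace_dim = 2 - np.linalg.matrix_rank(T - np.eye(2))
print(eigenspace_dim)   # 1, so no such decomposition exists
```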

Edit: In order for $T$ to be expressed as a direct sum of linear transformations on $V$ and $W$, both subspaces must be invariant under $T$, i.e. $Tv\in V$ and $Tw\in W$ for every $v\in V$, $w\in W$. For example, the linear transformation $T=\begin{bmatrix}1&1&0&0\\0&5&0&0\\0&0&1&2\\0&0&3&4\end{bmatrix}$ has the invariant subspaces $V=\text{span}\{(1,0,0,0),(0,1,0,0)\}$ and $W=\text{span}\{(0,0,1,0),(0,0,0,1)\}$ (orthogonal to each other under the standard inner product). In particular, $T$ decomposes if and only if its matrix, in a basis adapted to $V$ and $W$, consists of blocks along the diagonal, i.e. $T=\begin{bmatrix}A&0\\0&B\end{bmatrix}$.
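This criterion is easy to test in code (a sketch using NumPy, applied to the $4\times 4$ example above): stack the basis vectors of $V$ and $W$ into a change-of-basis matrix $M$, form $M^{-1}TM$, and check whether the two off-diagonal blocks vanish. If they do, the diagonal blocks are the transformations on the two subspaces.

```python
import numpy as np

T = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 5.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0, 4.0]])

# Bases for V = span{e1, e2} and W = span{e3, e4}
U = np.eye(4)[:, :2]                # 4 x 2
W = np.eye(4)[:, 2:]                # 4 x 2

M = np.hstack([U, W])               # change-of-basis matrix [U | W]
B = np.linalg.inv(M) @ T @ M        # T expressed in the (U, W) basis

k = U.shape[1]
decomposes = np.allclose(B[:k, k:], 0) and np.allclose(B[k:, :k], 0)
print(decomposes)                   # True: both off-diagonal blocks vanish
if decomposes:
    A, Bblock = B[:k, :k], B[k:, k:]   # the transformations on V and W
```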

  • Being diagonalizable is equivalent to being expressible as $T_1$ and $T_2$? (2017-01-28)
  • @ChristianWoll If the vector space is two-dimensional, yes, but not in general. See my edit. (2017-01-28)
  • There should be $v_i\cdot w_j$ in the second line. (2017-01-30)