
Suppose I have a matrix: $A \in \mathbb{R}^{n \times m}$ and another one (same size): $W \in \mathbb{R}^{n \times m}$

  1. When is it possible to find a square matrix $L$ such that $L \cdot A = W$, where "$\cdot$" is the usual matrix multiplication? When does $L$ have real entries?
  2. The same as the previous point, but with a right factor: find $R$ such that $A \cdot R = W$.
  3. Is it possible to find two square matrices $L'$ and $R'$ such that $L' \cdot A \cdot R' = W$, and, if so, do you think it will be easier, harder (or not easy to compare) than the decompositions proposed in the previous points?

Thank you very much.

  • $A=\begin{bmatrix}1&0\\0&0\end{bmatrix}$ and $W=\begin{bmatrix}0&0\\0&1\end{bmatrix}$ is actually sufficient! As Calvin says, consider both the range and kernel of the matrices, not just the rank. – 2012-12-30

1 Answer


This is a complete description of the necessary and sufficient conditions for such matrices to exist. I'll leave 3 to you, since it just combines 1 and 2.

  1. Think about the kernel. If $v \in \operatorname{Null}(A)$, i.e. $Av = 0$, then $0 = LAv = Wv$, so $v \in \operatorname{Null}(W)$. Hence $\operatorname{Null}(A) \subseteq \operatorname{Null}(W)$.

    Conversely, if $\operatorname{Null}(A) \subseteq \operatorname{Null}(W)$, let $\{ v_1, \ldots, v_i, v_{i+1}, \ldots, v_j, v_{j+1}, \ldots, v_m\}$ be a basis of $\mathbb{R}^m$ such that $\{v_k\}_{k=1}^i$ is a basis for $\operatorname{Null}(A)$ and $\{v_k\}_{k=1}^j$ is a basis for $\operatorname{Null}(W)$. Notice that $\{ Av_k \}_{k=i+1}^m$ is a linearly independent set, and so is $\{ Wv_k\}_{k=j+1}^m$. Define $L$ to be the linear transformation such that $L(Av_k) = 0$ for $k = i+1$ to $j$ and $L(Av_k) = Wv_k$ for $k = j+1$ to $m$, and extend $L$ to all of $\mathbb{R}^n$ (it doesn't matter how you extend it). Then $LA = W$ as linear transformations from $\mathbb{R}^m \rightarrow \mathbb{R}^n$, by checking the action on the basis.

  2. Think about the range. If $w \in \operatorname{Range}(W)$, then there exists $v \in \mathbb{R}^m$ such that $w = Wv$; since $AR = W$, we have $A(Rv) = w$, so $w \in \operatorname{Range}(A)$. This shows that $\operatorname{Range}(W) \subseteq \operatorname{Range}(A)$.

    Conversely, if $\operatorname{Range}(W) \subseteq \operatorname{Range}(A)$, let $\{v_1, \ldots, v_i, v_{i+1}, \ldots, v_m\}$ be a basis of $\mathbb{R}^m$ such that $\{ W v_k\}_{k=1}^i$ is a basis for $\operatorname{Range}(W)$ and $v_{i+1}, \ldots, v_m$ span $\operatorname{Null}(W)$ (possible since $\dim \operatorname{Null}(W) = m - i$). Define $R$ to be the linear transformation such that $Rv_k$ is a vector satisfying $A R v_k = W v_k$ for $k=1$ to $i$ (why must this exist?), and $Rv_k = 0$ for $k=i+1$ to $m$; for those $k$ we then have $ARv_k = 0 = Wv_k$. So $ARv_k = W v_k$ on the basis, hence $AR = W$.
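As a numerical sanity check (a sketch of my own, not part of the question): when $\operatorname{Null}(A) \subseteq \operatorname{Null}(W)$, the Moore–Penrose pseudoinverse gives an explicit left factor $L = W A^{+}$, since $A^{+}A$ is the projector onto $\operatorname{Null}(A)^{\perp}$; symmetrically, $R = A^{+} W$ works for part 2 when $\operatorname{Range}(W) \subseteq \operatorname{Range}(A)$. The matrices `A`, `W1`, `W2` below are made-up examples chosen to satisfy the respective conditions.

```python
import numpy as np

# Hypothetical example: A is 3x2 with rank 2, so Null(A) = {0}.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [0.0, 1.0]])

# Part 1: Null(A) = {0} is contained in Null(W1), so L = W1 A+ works.
W1 = np.array([[1.0, 0.0],
               [0.0, 1.0],
               [1.0, 1.0]])
L = W1 @ np.linalg.pinv(A)          # square, n x n = 3 x 3
assert np.allclose(L @ A, W1)       # L A = W1

# Part 2: build W2 with columns in Range(A) (combinations of A's
# columns), so Range(W2) ⊆ Range(A) by construction; R = A+ W2 works.
W2 = A @ np.array([[1.0, 1.0],
                   [0.0, 2.0]])
R = np.linalg.pinv(A) @ W2          # square, m x m = 2 x 2
assert np.allclose(A @ R, W2)       # A R = W2
```

Note that $L$ and $R$ are generally not unique: any map agreeing with these on $\operatorname{Range}(A)$ (for $L$) or differing by something mapping into $\operatorname{Null}(A)$ (for $R$) works too.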

  • @user1551 It doesn't require the rank-nullity theorem, though it does take an extra trick to go from $\operatorname{Null}(AR') \subseteq \operatorname{Null}(W)$ (from part 1) to $\dim \operatorname{Null}(A) \leq \dim \operatorname{Null}(W)$, to determine the suitable condition. To show that it is sufficient, simply define the action of $L'$, $R'$ on suitable bases $w_k$, $a_k$, as was done in both parts. – 2012-12-30
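The $2 \times 2$ example from the comments above illustrates part 3 nicely: neither a left factor alone nor a right factor alone can work (the kernel and range conditions both fail), yet since $\operatorname{rank} A = \operatorname{rank} W$, a pair $(L', R')$ exists. A sketch in numpy; the particular $L'$, $R'$ are my own choice.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 0.0]])
W = np.array([[0.0, 0.0],
              [0.0, 1.0]])

# No L with LA = W: e2 is in Null(A) but W e2 = e2 != 0.
# No R with AR = W: Range(W) = span(e2) is not inside Range(A) = span(e1).
# But rank A = rank W = 1, and a pair works:
Lp = np.array([[0.0, 0.0],
               [1.0, 0.0]])   # moves row 1 of (A Rp) down to row 2
Rp = np.array([[0.0, 1.0],
               [0.0, 0.0]])   # moves column 1 of A over to column 2
assert np.allclose(Lp @ A @ Rp, W)   # L' A R' = W
```

This matches the comment's point: for the two-sided problem only the rank matters, whereas the one-sided problems see the kernel and range individually.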