
Suppose I have a matrix: $$A \in \mathbb{R}^{n \times m}$$ and another one (same size): $$W \in \mathbb{R}^{n \times m}$$

  1. When is it possible to find a square matrix $L \in \mathbb{R}^{n \times n}$ such that: $$L\cdot A=W,$$ where "$\cdot$" is the usual matrix multiplication? When does the matrix $L$ have real entries?
  2. The same as the previous point, but with a right matrix $R \in \mathbb{R}^{m \times m}$ that solves: $$A\cdot R=W$$
  3. Is it possible to find two square matrices $L'$ and $R'$ such that: $$L' \cdot A \cdot R' = W$$ and, if so, do you think it will be easier, harder, or not easy to compare with the decompositions proposed in the previous points?

Thank you very much.

  • The answer to the question in the title is "no". Have you thought of examples where such $L$ and $R$ do not exist? (2012-12-30)
  • @JonasMeyer Do you mean when matrix $A$ is the zero matrix and $W$ isn't? (2012-12-30)
  • @Aslan986 That's a good example. To think about what the sufficient / necessary conditions are, think about the range space and the null space of these matrices, and see what constraints are needed. (2012-12-30)
  • Well, obviously if $A$ is invertible then $L=W\cdot A^{-1}$ and $R=A^{-1}\cdot W$, but there are also solutions if $\operatorname{rank}(W)\leq\operatorname{rank}(A)$ (though a bit harder to describe). I'm not sure, but I don't think that the generalization to two matrices $L',R'$ expands the set of solutions. I'd be interested to see a counterexample to my statement, though. (2012-12-30)
  • @Mario: Consider $A=\begin{bmatrix}1&0&0\\0&0&0\end{bmatrix}$ and $W=\begin{bmatrix}0&0&0\\0&0&1\end{bmatrix}$. (2012-12-30)
  • $A=\begin{bmatrix}1&0\\0&0\end{bmatrix}$ and $W=\begin{bmatrix}0&0\\0&1\end{bmatrix}$ is actually sufficient! As Calvin says, consider both the range and kernel of the matrices, not just the rank. (2012-12-30)

1 Answer


Here is a complete description of the necessary and sufficient conditions for these matrices to exist. I'll leave 3 for you to do, since it just combines 1 and 2.

  1. Think about the kernel. If $v \in \operatorname{Null}(A)$, so that $Av = 0$, then $0 = LAv = Wv$, so $v \in \operatorname{Null}(W)$. Hence $\operatorname{Null}(A) \subseteq \operatorname{Null}(W)$.

    Conversely, if $\operatorname{Null}(A) \subseteq \operatorname{Null}(W)$, let $\{v_1, \ldots, v_i, v_{i+1}, \ldots, v_j, v_{j+1}, \ldots, v_m\}$ be a basis of $\mathbb{R}^m$ such that $\{v_k\}_{k=1}^i$ is a basis for $\operatorname{Null}(A)$ and $\{v_k\}_{k=1}^j$ is a basis for $\operatorname{Null}(W)$. Notice that $\{Av_k\}_{k=i+1}^m$ is a linearly independent set, and so is $\{Wv_k\}_{k=j+1}^m$. Define $L$ to be the linear transformation such that $L(Av_k) = 0$ for $k = i+1$ to $j$ and $L(Av_k) = Wv_k$ for $k = j+1$ to $m$, and extend $L$ to all of $\mathbb{R}^n$ (it doesn't matter how you extend it). Then $LA = W$ as linear transformations from $\mathbb{R}^m \rightarrow \mathbb{R}^n$, by checking their action on the basis.

  2. Think about the range space. If $w \in \operatorname{Range}(W)$, then there exists $v \in \mathbb{R}^m$ such that $w = Wv = A(Rv)$, so $w \in \operatorname{Range}(A)$. This shows that $\operatorname{Range}(W) \subseteq \operatorname{Range}(A)$.

    Conversely, if $\operatorname{Range}(W) \subseteq \operatorname{Range}(A)$, let $\{v_1, \ldots, v_i, v_{i+1}, \ldots, v_m\}$ be a basis of $\mathbb{R}^m$ such that $\{v_k\}_{k=i+1}^m$ is a basis for $\operatorname{Null}(W)$; then $\{Wv_k\}_{k=1}^i$ is a basis for $\operatorname{Range}(W)$. Define $R$ to be the linear transformation such that, for $k=1$ to $i$, $Rv_k$ is a vector satisfying $A(Rv_k) = Wv_k$ (why must this exist?), and $Rv_k = 0$ for $k = i+1$ to $m$ (where $Wv_k = 0$ as well). Then $ARv_k = Wv_k$ on the basis, hence $AR = W$. Both constructions are illustrated numerically in the sketch below.
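Here is a minimal numerical sketch of both constructions, assuming NumPy. The closed forms $L = WA^+$ and $R = A^+W$ (via the Moore-Penrose pseudoinverse) are one convenient choice rather than the basis-by-basis construction above, and the example matrices are made up for illustration:

```python
import numpy as np

# Hypothetical example matrices (made up for illustration):
# Null(A) = span(e_3) = Null(W), and Range(W) lies in Range(A) = R^2.
A = np.array([[1., 0., 0.],
              [0., 2., 0.]])
W = np.array([[3., 1., 0.],
              [0., 4., 0.]])

A_pinv = np.linalg.pinv(A)  # Moore-Penrose pseudoinverse A^+

# Part 1: if Null(A) is contained in Null(W), then L = W A^+ solves L A = W,
# because A^+ A is the orthogonal projector onto the complement of Null(A).
L = W @ A_pinv                 # square, n x n
print(np.allclose(L @ A, W))   # True

# Part 2: if Range(W) is contained in Range(A), then R = A^+ W solves A R = W,
# because A A^+ is the orthogonal projector onto Range(A).
R = A_pinv @ W                 # square, m x m
print(np.allclose(A @ R, W))   # True
```

This works because $A^+A$ is the orthogonal projector onto $\operatorname{Null}(A)^\perp$ and $AA^+$ is the orthogonal projector onto $\operatorname{Range}(A)$; the subspace inclusions make these projectors act as the identity as far as $W$ is concerned.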

  • The [markdown formatting](http://stackoverflow.com/editing-help#simple-lists) on the site tries to "help" you create numbered lists. Putting something null after the period, like ` ` (non-breaking space) or `$\,$` (small space in LaTeX), confuses it into not helping. (2012-12-30)
  • How do you get 3 by "just combining 1 and 2"? Let $A=\begin{pmatrix}1&0\\0&0\end{pmatrix}$ and $W=I-A$. Then $\operatorname{Null}(A)\not\subseteq \operatorname{Null}(W)$ and $\operatorname{Range}(W)\not\subseteq \operatorname{Range}(A)$, but $LAR=W$ for some $L$ and $R$. (2012-12-30)
  • @Zev: The better solution is to indent the subsequent paragraphs that are supposed to be part of the same bullet point, as Mario just did. (2012-12-30)
  • @user1551 That is because you didn't apply the problem at all. Treating $AR'$ as a single matrix, the existence of $L'$ implies that $\operatorname{Null}(AR') \subseteq \operatorname{Null}(W)$, and since $\operatorname{rank}(AR') \leq \operatorname{rank}(A)$, we get $\dim \operatorname{Ker}(A) \leq \dim \operatorname{Ker}(W)$. Similarly, treating $L'A$ as a single matrix, we have $\dim \operatorname{Range}(A) \geq \dim \operatorname{Range}(W)$. [This also follows from rank-nullity.] This is the necessary condition. Now show that it is sufficient by constructing $R'$ and $L'$, defining their action on a suitable basis obtained from $W$ and $A$ respectively. (2012-12-30)
  • @RahulNarain Can you explain how to indent the paragraphs? I tried to see your edits, but it doesn't show how to do it. (2012-12-30)
  • @Calvin: Mario just added a space at the beginning of the paragraphs, which indicates that they should be formatted as part of the previous bullet point. (2012-12-30)
  • @CalvinLin Sigh. What I am saying is that part 3 is the part that obviously requires extra tricks (e.g. the rank-nullity theorem) and the most effort. To answer the two easiest parts of the OP's question and then claim to the OP (who apparently is a novice) that he or she can get the result of the remaining part by "just combining 1 and 2" is just misleading. (2012-12-30)
  • @user1551 It doesn't require the rank-nullity theorem, though it does require an extra trick in going from $\operatorname{Null}(AR') \subseteq \operatorname{Null}(W)$ (from part 1) to $\dim \operatorname{Ker}(A) \leq \dim \operatorname{Ker}(W)$, to determine the suitable condition. To show that it is sufficient, we simply define the action of $L', R'$ on suitable bases $w_k$, $a_k$, as was later done in both parts; a numerical sketch of one such construction follows below. (2012-12-30)
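For part 3, here is a hedged numerical sketch (again assuming NumPy) using the counterexample matrices from the comments. The SVD-based formulas for $L'$ and $R'$ are one possible construction under the condition $\operatorname{rank}(W) \leq \operatorname{rank}(A)$, written here for square matrices for simplicity; they are not spelled out in the thread:

```python
import numpy as np

# Matrices from the comment thread: Null(A) is not contained in Null(W)
# and Range(W) is not contained in Range(A), yet rank(W) <= rank(A),
# so L' A R' = W is still solvable.
A = np.array([[1., 0.],
              [0., 0.]])
W = np.array([[0., 0.],
              [0., 1.]])

U_A, s_A, Vt_A = np.linalg.svd(A)   # A = U_A @ diag(s_A) @ Vt_A
U_W, s_W, Vt_W = np.linalg.svd(W)   # W = U_W @ diag(s_W) @ Vt_W

# One possible construction (an assumption of this sketch):
#   L' = U_W Sigma_W Sigma_A^+ U_A^T   and   R' = V_A V_W^T.
# Then L' A R' = U_W Sigma_W (Sigma_A^+ Sigma_A) Vt_W, which equals
# U_W Sigma_W Vt_W = W whenever rank(W) <= rank(A).
L_prime = U_W @ np.diag(s_W) @ np.linalg.pinv(np.diag(s_A)) @ U_A.T
R_prime = Vt_A.T @ Vt_W

print(np.allclose(L_prime @ A @ R_prime, W))  # True
```

The key step is $\Sigma_W (\Sigma_A^+ \Sigma_A) = \Sigma_W$: the product $\Sigma_A^+ \Sigma_A$ zeroes out everything past the first $\operatorname{rank}(A)$ diagonal positions, which is harmless because $\Sigma_W$ has at most $\operatorname{rank}(W) \leq \operatorname{rank}(A)$ nonzero singular values.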