6

This is a problem from an olympiad I took today. I tried but couldn't solve it.

Let $A$ and $B$ be rectangular matrices with real entries, of dimensions $k\times n$ and $m\times n$ respectively. Prove that if for every $n\times l$ matrix $X$ ($l\in \mathbb{Z}^+$) the equality $AX=0$ implies $BX=0$, then there exists a matrix $C$ such that $B=CA$.

I tried to define a commutative diagram, but failed. Anyhow, I'd prefer a solution that works with the matrices explicitly. But I'll welcome and appreciate any solutions.

2 Answers

3

$AX=0$ means that the rows of $A$ are all orthogonal to the columns of $X$, and likewise for $BX=0$. $B=CA$ means that the rows of $B$ are linear combinations of the rows of $A$.

Assume that there is a row of $B$ that is not a linear combination of the rows of $A$. Orthogonalize that row to the rows of $A$, and choose $X$ as the column vector corresponding to this orthogonalized row. Then $AX=0$, but $BX\neq0$. Hence the rows of $B$ are linear combinations of the rows of $A$.
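The contrapositive argument above can be illustrated numerically. This is a small sketch (numpy assumed, with made-up example dimensions): take a row $b$ outside the row space of $A$, orthogonalize it against the rows of $A$, and check that the resulting column vector is a witness with $AX=0$ but $b^\top X\neq 0$.

```python
import numpy as np

# Hypothetical illustration of the orthogonalization argument.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 4))      # k = 2, n = 4

# A generic random row b is (almost surely) not in the row space of A.
b = rng.standard_normal(4)

# Orthogonalize b against the rows of A: subtract its projection onto
# the row space, keeping only the orthogonal component b_c.
P = np.linalg.pinv(A) @ A            # orthogonal projector onto the row space of A
b_c = b - P @ b

X = b_c.reshape(-1, 1)               # the witness column vector
assert np.allclose(A @ X, 0)         # AX = 0 ...
assert abs(b @ b_c) > 1e-8           # ... but b·X = |b_c|^2 != 0, so BX != 0
```

So if $b$ were a row of $B$, this $X$ would violate the hypothesis; hence every row of $B$ must lie in the row space of $A$.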

  • 0
    Sorry, what do you mean by "orthogonalize that row to the rows of $A$" ?2011-02-05
  • 0
    @joriki: I'm not really grasping your solution, could you elaborate a little? Specially the last 2 lines.2011-02-05
  • 1
The rows of $A$ span a vector space $R$ (of row vectors). Every row vector $b$ can be written as the sum of an element $b_r$ of $R$ and an element $b_c$ of its orthogonal complement. If a row $b$ of $B$ is not a linear combination of the rows of $A$ and you write it in this form, then $b_c$ will necessarily be non-zero. By orthogonalizing to the rows of $A$, I mean dropping $b_r$ and keeping only $b_c$. This is orthogonal to all rows of $A$ but not orthogonal to $b$, since $b_c \cdot b = b_c \cdot (b_r + b_c) = b_c \cdot b_c \neq 0$. Thus if you choose $X$ as the corresponding column vector, $AX = 0$, but $BX \neq 0$. Hope that makes it clearer?2011-02-05
  • 0
    @joriki: I think I get it now.2011-02-07
  • 0
    @joriki: aren't the rows of $B$ linear combinations of the *columns* of A?2011-03-02
  • 0
@Weltschmerz: I don't think so -- you can see that from the dimensions of $A$ and $B$, which are $k\times n$ and $m\times n$, respectively. Only the column counts $n$ coincide, and the column count is the length of the rows. But you can also see it in the actual matrix multiplication, $B_{ik}=\sum_j C_{ij}A_{jk}$ -- this adds row $j$ of $A$ to row $i$ of $B$ with coefficient $C_{ij}$.2011-03-02
  • 0
    @joriki: you're right. I was looking at it the wrong way. Thanks!2011-03-04
2

Think of $A$ as a linear transformation from $\mathbb{R}^n$ to $\mathbb{R}^k$, and $B$ as a linear transformation from $\mathbb{R}^n$ to $\mathbb{R}^m$. Write $L_M$ for the linear transformation defined by multiplication by the matrix $M$.

For the matrix $C$ to exist, you must have a linear transformation $L$ from $\mathbb{R}^k$ to $\mathbb{R}^m$ such that $L\circ L_A = L_B$. Then you can define $C$ to be the standard matrix representation of $L$.

A necessary condition for this to be possible is that $\mathbf{N}(L_A) \subseteq \mathbf{N}(L_B)$ (if there is $\mathbf{v}\in\mathbf{N}(L_A)$ that is not in $\mathbf{N}(L_B)$, then $L\circ L_A(\mathbf{v})=\mathbf{0}\neq L_B(\mathbf{v})$ and you're sunk).

In fact, this is sufficient: find a basis $\mathbf{v}_1,\ldots,\mathbf{v}_s$ for the nullspace of $A$, and then extend it by $\mathbf{v}_{s+1},\dots,\mathbf{v}_n$ to a basis for $\mathbb{R}^n$. Note that the images $A\mathbf{v}_{s+1},\ldots,A\mathbf{v}_n$ in $\mathbb{R}^k$ are linearly independent (this is essentially the Rank-Nullity Theorem), so you can complete $A\mathbf{v}_{s+1},\ldots,A\mathbf{v}_n$ to a basis of $\mathbb{R}^k$ by adjoining vectors $\mathbf{w}_1,\ldots,\mathbf{w}_{k-(n-s)}$. Define $L\colon\mathbb{R}^k\to\mathbb{R}^m$ by letting $L(A\mathbf{v}_j) = B\mathbf{v}_j$ for $j=s+1,\ldots,n$, and arbitrarily for the $\mathbf{w}_i$. Then let $C$ be the standard matrix for this linear transformation.
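In a computation one rarely builds $L$ basis vector by basis vector; a quick numeric sketch (numpy assumed, example matrices made up for illustration) is to solve $CA=B$ directly as the linear system $A^\top C^\top = B^\top$, which has an exact solution precisely when the nullspace condition holds.

```python
import numpy as np

A = np.array([[1., 0., 2.],
              [0., 1., 1.]])              # k = 2, n = 3
B = np.array([[2., 3., 7.],               # 2*(row 1) + 3*(row 2) of A
              [1., -1., 1.]])             # (row 1) - (row 2) of A; m = 2

# Least squares returns an exact solution here because each row of B
# lies in the row space of A (equivalently, null(A) is contained in null(B)).
C = np.linalg.lstsq(A.T, B.T, rcond=None)[0].T

assert np.allclose(C @ A, B)              # B = CA holds exactly
```

If instead some row of $B$ were outside the row space of $A$, the least-squares residual would be nonzero and no exact $C$ would exist, matching the necessity argument above.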

  • 0
    Shouldn't it be a linear transformation from $\mathbb{R}^k$ to $\mathbb{R}^m$ such that $L_C \circ L_A = L_B$?2011-02-05
  • 0
    @PEV: Yes, thanks.2011-02-05