
This is a problem from an olympiad I took today. I tried but couldn't solve it.

Let $A$ and $B$ be rectangular matrices with real entries, of dimensions $k\times n$ and $m\times n$ respectively. Prove that if for every $n\times l$ matrix $X$ ($l\in \mathbb{Z}^+$) the equality $AX=0$ implies $BX=0$, then there exists a matrix $C$ such that $B=CA$.

I tried to define a commutative diagram, but failed. Anyhow, I'd prefer a solution that works with the matrices explicitly. But I'll welcome and appreciate any solutions.

2 Answers

3

$AX=0$ means that the rows of $A$ are all orthogonal to the columns of $X$, and likewise for $BX=0$. $B=CA$ means that the rows of $B$ are linear combinations of the rows of $A$.

Assume that there is a row of $B$ that is not a linear combination of the rows of $A$. Orthogonalize that row against the rows of $A$ (subtract its projection onto the row space of $A$), and choose $X$ to be the column vector corresponding to this orthogonalized row. Then $AX=0$, but $BX\neq0$: the entry of $BX$ coming from that row of $B$ equals the squared norm of the nonzero orthogonal remainder. This contradicts the hypothesis, hence the rows of $B$ are linear combinations of the rows of $A$.
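A quick numerical sketch of this argument (the matrices here are illustrative assumptions, not from the problem): a row of $B$ outside the row space of $A$ yields an $X$ with $AX=0$ but $BX\neq0$.

```python
import numpy as np

A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])      # rows span the e1-e2 plane in R^3
B = np.array([[1.0, 2.0, 3.0]])      # this row is NOT in the row space of A

b = B[0]
# Subtract the projection of b onto the row space of A (the column
# space of A^T); what remains is orthogonal to every row of A.
coeffs, *_ = np.linalg.lstsq(A.T, b, rcond=None)
x = (b - A.T @ coeffs).reshape(-1, 1)   # the column vector X

print(A @ x)   # ~ [[0.], [0.]]  ->  AX = 0
print(B @ x)   # [[9.]]          ->  BX != 0
```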

  • 0
    @joriki: You're right. I was looking at it the wrong way. Thanks! (2011-03-04)
2

Think of $A$ as a linear transformation from $\mathbb{R}^n$ to $\mathbb{R}^k$, and $B$ as a linear transformation from $\mathbb{R}^n$ to $\mathbb{R}^m$. Write $L_M$ for the linear transformation defined by multiplication by the matrix $M$.

For the matrix $C$ to exist, you must have a linear transformation $L$ from $\mathbb{R}^k$ to $\mathbb{R}^m$ such that $L\circ L_A = L_B$. Then you can define $C$ to be the standard matrix representation of $L$.

A necessary condition for this to be possible is that $\mathbf{N}(L_A) \subseteq \mathbf{N}(L_B)$ (if there is $\mathbf{v}\in\mathbf{N}(L_A)$ that is not in $\mathbf{N}(L_B)$, then $L\circ L_A(\mathbf{v})=\mathbf{0}\neq L_B(\mathbf{v})$ and you're sunk).

In fact, this is sufficient: find a basis $\mathbf{v}_1,\ldots,\mathbf{v}_s$ for the nullspace of $A$, and then extend it by vectors $\mathbf{v}_{s+1},\dots,\mathbf{v}_n$ to a basis of $\mathbb{R}^n$. Note that the images $A\mathbf{v}_{s+1},\ldots,A\mathbf{v}_n$ in $\mathbb{R}^k$ are linearly independent (this is essentially the Rank-Nullity Theorem), so you can complete $A\mathbf{v}_{s+1},\ldots,A\mathbf{v}_n$ to a basis of $\mathbb{R}^k$ by adjoining vectors $\mathbf{w}_1,\ldots,\mathbf{w}_{k-(n-s)}$. Define $L\colon\mathbb{R}^k\to\mathbb{R}^m$ by letting $L(A\mathbf{v}_j) = B\mathbf{v}_j$ for $j=s+1,\ldots,n$, and arbitrarily (say, $L(\mathbf{w}_i)=\mathbf{0}$) on the $\mathbf{w}_i$. Then let $C$ be the standard matrix of this linear transformation. Since $\mathbf{N}(L_A)\subseteq\mathbf{N}(L_B)$, for $j\leq s$ we also have $CA\mathbf{v}_j=\mathbf{0}=B\mathbf{v}_j$, so $CA$ and $B$ agree on a basis of $\mathbb{R}^n$, hence $B=CA$.
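The construction above picks one particular $C$. A numerical sketch of the conclusion, with assumed example matrices and using NumPy's Moore-Penrose pseudoinverse $A^+$ instead of the explicit basis construction: when every row of $B$ lies in the row space of $A$, the choice $C=BA^+$ works, because $A^+A$ is the orthogonal projector onto the row space of $A$.

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
B = np.array([[1.0, 1.0, 2.0],     # = row1 + row2 of A
              [2.0, -1.0, 1.0]])   # = 2*row1 - row2 of A

C = B @ np.linalg.pinv(A)          # A^+ A projects onto the row space of A,
                                   # which fixes every row of B
print(np.allclose(C @ A, B))       # True: B = CA
```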

  • 0
    @PEV: Yes, thanks. (2011-02-05)