
Suppose the last column of $AB$ is entirely zero but $B$ itself has no column of zeros. What can you say about the columns of $A$?

Correct answer: The columns of $A$ are linearly dependent.

I don't see the connection between the two facts. Linear dependence means some nontrivial linear combination of the columns equals $0$, but I don't see how a column of all zeros in $AB$ produces such a combination.

  • The last column of $B$ gives a linear dependence on the *columns* of $A$, since $B$ has *no column* of all zeros. (2017-02-21)

3 Answers


Hint: Write out the definition of the matrix product, then try to rewrite the last column of the product as a linear combination of the columns of $A$. You'll find the non-trivial linear combination that results in 0.

  • (+1) Another way to look at it: the last column of $AB$ is the matrix product of $A$ with the last column vector of $B$. (2017-02-21)
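This hint is easy to check numerically. Below is a minimal NumPy sketch; the specific matrices are my own illustration, not part of the question. The third column of $A$ is the sum of the first two, and the last column of $B$ encodes exactly that dependence.

```python
import numpy as np

# Hypothetical example: the third column of A is the sum of the first
# two, so the columns of A are linearly dependent.
A = np.array([[1, 0, 1],
              [0, 1, 1],
              [2, 3, 5]])

# B has no zero column, but its last column (1, 1, -1) encodes the
# dependence a1 + a2 - a3 = 0 among the columns of A.
B = np.array([[1, 0, 1],
              [0, 1, 1],
              [0, 0, -1]])

AB = A @ B

# Each column of AB is A applied to the corresponding column of B.
for j in range(B.shape[1]):
    assert np.array_equal(AB[:, j], A @ B[:, j])

# The last column of AB is zero even though B's last column is not.
print(AB[:, -1])       # -> [0 0 0]
print(B[:, -1].any())  # -> True (last column of B is nonzero)
```

The zero column of $AB$ appears precisely because the last column of $B$ lists the coefficients of a nontrivial dependence among the columns of $A$.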

Let the columns of $B$ be $b_{1},\ldots,b_{n},$ and the columns of $A$ be $a_{1},\ldots,a_{n}.$ Then the columns of $AB$ are just $Ab_{1},\ldots,Ab_{n},$ so in particular each column of $AB$ is a linear combination of the columns of $A$: writing the entries of $b_{j}$ as $b_{i,j},\ 1\leq i\leq n,$ we have $Ab_{j}=\sum_{i=1}^{n}b_{i,j}a_{i}.$ So if $Ab_{j}=0$ for some $j$ with $b_{j}\neq 0,$ we have a nontrivial linear combination of the columns of $A$ which is equal to $0$; in other words, the columns of $A$ are linearly dependent.
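The identity $Ab_{j}=\sum_{i=1}^{n}b_{i,j}a_{i}$ can be verified numerically. A small NumPy sketch with arbitrary made-up matrices (the sizes and random seed are my own choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.integers(-3, 4, size=(n, n))  # columns a_1, ..., a_n
b = rng.integers(-3, 4, size=n)       # entries b_1, ..., b_n

# A @ b expanded as sum_i b_i * a_i, a linear combination of the
# columns of A with coefficients taken from b.
combo = sum(b[i] * A[:, i] for i in range(n))
assert np.array_equal(A @ b, combo)
```

So a matrix-vector product is nothing but a linear combination of the matrix's columns, which is the whole content of the answer above.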


I guess what you have seen about matrix multiplication of matrices $A,B$ (of suitable dimensions) is that, for example, the entry in row 1, column 1 of $AB$ is the product of the first row of $A$ with the first column of $B$. Let us write down a general $n \times m$ matrix $A$ and a general $m \times k$ matrix $B$: \begin{equation} A = \begin{pmatrix} a_{11} & a_{12} & \ldots& a_{1m}\\ a_{21} & a_{22} & \ldots& a_{2m}\\ \vdots & \vdots & & \vdots\\ a_{n1} & a_{n2} & \ldots & a_{nm} \end{pmatrix} \quad \text{and} \quad B = \begin{pmatrix} b_{11} & b_{12} & \ldots& b_{1k}\\ b_{21} & b_{22} & \ldots& b_{2k}\\ \vdots & \vdots & & \vdots\\ b_{m1} & b_{m2} & \ldots & b_{mk} \end{pmatrix}. \end{equation} The result of the matrix multiplication, as you are probably used to computing it, is the following matrix: \begin{equation} AB = \begin{pmatrix} \sum_{i = 1}^m a_{1i}b_{i1} & \sum_{i = 1}^m a_{1i}b_{i2} & \ldots & \sum_{i = 1}^m a_{1i}b_{ik}\\ \sum_{i = 1}^m a_{2i}b_{i1} & \sum_{i = 1}^m a_{2i}b_{i2} & \ldots & \sum_{i = 1}^m a_{2i}b_{ik}\\ \vdots & \vdots & & \vdots\\ \sum_{i = 1}^m a_{ni}b_{i1} & \sum_{i = 1}^m a_{ni}b_{i2} & \ldots & \sum_{i = 1}^m a_{ni}b_{ik} \end{pmatrix}. \end{equation}

We notice something in this result: the first column of $AB$ involves only the entries $b_{i1}$, for $i \in \{1, \ldots, m\}$; that is, only the first column of $B$. Analogously, the second column of $AB$ contains only the $b_{i2}$ (the second column of $B$), and so on. Let me denote the columns of $A$ by the column vectors $\vec{a}_i = (a_{1i}, a_{2i}, \ldots, a_{ni})^t$: \begin{equation} A = \begin{pmatrix} | & | & & |\\ \vec{a}_1 & \vec{a}_2 & \ldots & \vec{a}_m\\ | & | & & | \end{pmatrix} \end{equation} then our remark, together with the usual way of computing the product, gives: \begin{equation} AB = \begin{pmatrix} | & | & & |\\ \sum_{i = 1}^m b_{i1}\vec{a}_i & \sum_{i = 1}^m b_{i2}\vec{a}_i & \ldots & \sum_{i = 1}^m b_{ik}\vec{a}_i\\ | & | & & | \end{pmatrix}.
\end{equation}

This way of looking at the matrix multiplication is more suitable for this exercise: the $j$th column of $AB$ is the linear combination of the columns of $A$ whose coefficients are the entries of the $j$th column of $B$. Can you use this remark to solve your problem?

$\textbf{Worth remarking:}$ A similar observation holds for the rows of $B$ (instead of the columns of $A$) and the product $AB$: each row of $AB$ is a linear combination of the rows of $B$. This might be nice to work out yourself.
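The row-version of the remark can be checked the same way: row $i$ of $AB$ is the linear combination of the rows of $B$ whose coefficients come from row $i$ of $A$. A short NumPy sketch with made-up random matrices (sizes and seed are my own choices):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.integers(-2, 3, size=(3, 4))
B = rng.integers(-2, 3, size=(4, 5))
AB = A @ B

# Row i of AB is the linear combination of the rows of B whose
# coefficients are the entries of row i of A.
for i in range(A.shape[0]):
    combo = sum(A[i, j] * B[j, :] for j in range(B.shape[0]))
    assert np.array_equal(AB[i, :], combo)
```

So columns of $AB$ live in the column space of $A$, while rows of $AB$ live in the row space of $B$; the two views are mirror images of each other.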