
$$A= \begin{pmatrix} 1 &2 &3 \\ 0 &6 &4 \\ 0 &3 &2 \end{pmatrix}$$

"Rank is the highest possible number of linearly independent column / line vectors."

(I hope I have translated correctly?)

Now using that, I see that lines 2 and 3 are multiples of each other, so they are linearly dependent. But line 1 isn't. So the rank is 1.

That was about line vectors.

Looking at the column vectors, no vector is a multiple of another, so none of them are linearly dependent. So the rank is 3.

But what is the rank now? Shouldn't the number of linearly independent vectors be the same whether I count lines or columns?

So what is the rank of this matrix now? I'm very confused. Or is the definition wrong?
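Questions like this can also be checked numerically. A minimal sketch, assuming NumPy is available (this check is an editorial addition, not part of the original question):

```python
import numpy as np

# The matrix from the question.
A = np.array([[1, 2, 3],
              [0, 6, 4],
              [0, 3, 2]])

# matrix_rank computes the rank numerically (via SVD),
# which sidesteps the row-vs-column bookkeeping entirely.
print(np.linalg.matrix_rank(A))  # → 2
```

The single number it returns is both the row rank and the column rank, since the two always coincide.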

  • But rows $1$ and $2$ are independent, so the rank is $2$. (2017-02-21)
  • "Lines 2 and 3 are multiples, thus they are linearly dependent. But line 1 isn't." Right, hence... why "Rank is 1"? (2017-02-21)
  • Look at the comments on your previous question [here](http://math.stackexchange.com/questions/2155095/how-can-i-quickly-know-the-rank-of-this-any-other-matrix). (2017-02-21)
  • Understood, thanks everyone. (2017-02-21)
  • Ok, I have one last question. This trick only works with square matrices, right? I have another matrix in front of me, a $2 \times 3$, and the thing with multiples doesn't seem to work on it. (2017-02-21)

1 Answer


First, the conventional nomenclature is to call the horizontal lines "rows" and the vertical lines "columns".

It seems like in the comments you were convinced that there are two linearly independent row vectors and thus the rank is two. This is correct.

However, you also said that no two column vectors are multiples of each other and concluded that the rank is three. This cannot be the case, since the rank theorem says the column rank equals the row rank. To see directly that the three column vectors are not linearly independent, exhibit a linear combination of them that adds up to zero (or, equivalently, a linear combination of two of them that equals the third). It's not too hard to hunt around for an example, since the first column is $(1,0,0)^T$: $$(3,4,2)^T-\tfrac{2}{3}(2,6,3)^T = (5/3,0,0)^T= \tfrac{5}{3}(1,0,0)^T.$$ Thus the column rank is less than three, and it takes only a little more work to show directly that it is two.
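The linear combination above can be verified numerically; a short NumPy sketch (an editorial addition, not part of the original answer):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [0, 6, 4],
              [0, 3, 2]])
c1, c2, c3 = A[:, 0], A[:, 1], A[:, 2]

# The combination from the answer: c3 - (2/3)*c2 equals (5/3)*c1,
# so the columns are linearly dependent.
lhs = c3 - (2 / 3) * c2
print(np.allclose(lhs, (5 / 3) * c1))  # → True

# Row rank and column rank agree, as the rank theorem says:
print(np.linalg.matrix_rank(A))    # → 2
print(np.linalg.matrix_rank(A.T))  # → 2
```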

  • I have one last question. This trick only works with square matrices, right? I have another matrix in front of me, a $2 \times 3$, and the thing with multiples doesn't seem to work on it. (2017-02-21)
  • @cnmesr The trick of checking whether they are multiples is only sufficient for checking the linear independence of a set of two vectors. If there are more than two, as the columns of this example show, you need to check whether some linear combination of them adds to zero (which is equivalent to being multiples when there are only two). Fortunately, a $2\times 3$ matrix has only two rows: if the two rows are proportional, the rank is $1$; otherwise it is $2$ (unless all entries are zero, in which case the rank is $0$). The rank theorem holds for non-square matrices too, so the column rank is the same. (2017-02-21)
  • The more we get into detail, the more questions I have, sorry. But would it be enough in an exam to just look at either the rows or the columns and then state the rank? Since both are equal, it should suffice to look at whichever is "visible" (in this case, the rows). Or must I really check both rows AND columns? (2017-02-21)
  • @cnmesr I'm not sure what you mean by 'visible', but yes, it's sufficient to directly find either the row rank or the column rank; the other is the same by the rank theorem. (2017-02-21)
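The $2\times 3$ rule discussed in the comments (rank $2$ if the rows are not proportional, rank $1$ if they are proportional and nonzero, rank $0$ if all entries vanish) can be checked with a quick sketch; the example matrices below are made up for illustration:

```python
import numpy as np

# Hypothetical 2x3 matrices chosen to illustrate the rule.
B = np.array([[1, 2, 3],
              [2, 4, 6]])   # second row = 2 * first row -> proportional
C = np.array([[1, 2, 3],
              [0, 1, 1]])   # rows not proportional

print(np.linalg.matrix_rank(B))  # → 1
print(np.linalg.matrix_rank(C))  # → 2
print(np.linalg.matrix_rank(np.zeros((2, 3))))  # → 0
```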