0

I asked the same question yesterday, but this one is a bit different in terms of computations. It is from the exam I took an hour ago.

For what $h$ are the columns of this matrix linearly dependent? $$\begin{bmatrix} 1 & -3 & 4 \\ -4 & 7 & h\\ 2 & -6 & 8 \end{bmatrix}$$

Attempt: after row reducing, but not completely: $$\begin{bmatrix} 1 & -3 & 4 & 0 \\ -4 & 7 & h & 0 \\ 2 & -6 & 8 & 0 \end{bmatrix} \sim \begin{bmatrix} 1 & -3 & 4 & 0 \\ 0 & -5 & h+16 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix} $$

My guess was that the columns are linearly dependent if $h=-16$ or $h=-\frac{28}{3}$, and I basically just guessed the $-16$. Hints, please.
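For anyone who wants to double-check the reduction, here is a minimal sketch with $h$ kept symbolic (assuming SymPy is available):

```python
import sympy as sp

h = sp.symbols('h')
A = sp.Matrix([[ 1, -3, 4],
               [-4,  7, h],
               [ 2, -6, 8]])

# Reduced row echelon form with h left symbolic; the pivots land in
# columns 0 and 1 no matter what h is.
rref_matrix, pivot_cols = A.rref()
print(rref_matrix)
print(pivot_cols)   # (0, 1)
print(A.rank())     # 2 for every h, so the three columns are always dependent
```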

  • 0
    Hint: the last row is all zeroes, independently of the value of $h$. (2012-07-19)
  • 0
    Yeah, so $x_3$ can be anything. But the vector that contains $h$ should be collinear with at least one other vector. So $h$ should be some number(s) that lets its vector be a scalar multiple of another vector. That's how I see it. (2012-07-19)
  • 4
    In fact, this is the *same* as your previous question (in spirit, if not in details): since the third row is a multiple of the first row, it doesn't matter what $h$ is, the matrix is not full rank. By the way: you did not give a **set**, you gave a matrix; and a matrix is neither linearly dependent nor independent. Presumably, you are referring to the rows or columns of the matrix in some way? (2012-07-19)
  • 0
    The columns of the matrix are the set of vectors I am referring to in my question. (2012-07-19)
  • 1
    In a square matrix, the rows are linearly independent if and only if the columns are linearly independent. This is not obvious, but it is nonetheless true: it follows from the fact that in *any* matrix, the dimension of the rowspace is the same as the dimension of the columnspace. But if you are interested in the linear independence of the *columns*, why are you looking at the augmented matrix? It's not relevant. (2012-07-19)
  • 0
    @Dostre: You are incorrect in implicitly saying that the three vectors are linearly dependent only if the third vector "is collinear to at least one other vector". It is true that a set with **two** vectors is linearly dependent if and only if one of them is a multiple of the other, but when you get to three or more this is **not** true anymore. The three vectors are linearly dependent if and only if one of them lies in the *plane* spanned by the other two. (2012-07-19)
  • 0
    @ArturoMagidin That's what I cannot grasp. By the definition of linear dependence, there should be a nontrivial solution of $A\vec x =\vec 0$ in order for the columns of $A$ to be linearly dependent. So, after I row reduced this matrix (not completely), I wrote it down the following way: $$x_1\left[ \begin{array}{c} 1\\ 0\\0 \end{array} \right]+x_2 \left[ \begin{array}{c} -3 \\ -5 \\0 \end{array} \right]+x_3\left[ \begin{array}{c} 4 \\ h+16\\0 \end{array} \right] =\vec 0 $$ (2012-07-19)
  • 0
    @ArturoMagidin From there I concluded that the set of the columns of $A$ is linearly dependent when $h=-16$ or $h=-28/3$. (2012-07-19)
  • 0
    @Dostre: I don't see how you conclude that. Why only those two values? You do realize that there are *infinitely* many ways in which $x_1-3x_2$ can equal $4$, and not just $x_1=1$, $x_2=-1$ (which is, I am guessing, where you got $h = -28/3$)? **Every** value of $h$ gives you a solution. You have a **free variable** in this system, so there are **infinitely** many solutions *always*. (2012-07-19)
  • 1
    @Dostre: And why $-16$? Because that makes the entry equal to $0$? So what? That value comes from selecting $x_2=0$; but pick your favorite value of $x_2$, and then you can find a value of $h$ that will make the equation true. Conversely, pick your favorite value for $h$, solve for $x_2$ from $-5x_2 + h + 16 = 0$; and then use the value to solve for $x_1$ from $x_1-3x_2+4=0$ (having selected $x_3=1$): *every* value of $h$ gives you a solution. (2012-07-19)

3 Answers

0

Compute the determinant; the root of $\det=0$ is the only $h$ that makes the set linearly dependent.
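A sketch of that computation, assuming SymPy is available; for this particular matrix the determinant turns out not to depend on $h$ at all:

```python
import sympy as sp

h = sp.symbols('h')
A = sp.Matrix([[ 1, -3, 4],
               [-4,  7, h],
               [ 2, -6, 8]])

d = A.det()
print(sp.expand(d))   # 0: the determinant is identically zero,
                      # so det = 0 holds for every value of h
```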

  • 2
    The first and third rows are proportional (the third is twice the first), so the determinant will be zero for all $h$. (2012-07-19)
  • 0
    Then for all $h$, the columns are linearly dependent, since column rank = row rank. (2012-07-19)
2

Based on your question from before, let's take a step back and revisit what linear dependence means.

Specifically, a set of vectors is linearly dependent if it is not linearly independent. Well, maybe that's not useful on its own, but now we get to talk about linear independence!

A set of vectors is linearly independent if none of the vectors in the set can be written as a linear combination of finitely many other vectors in the set. So, if you can write one of the vectors, say, $v_2$ as a combination of some finite number of other vectors, then your set is linearly dependent, e.g. $v_2 = 2v_1 + 5v_6$.

Rather than trying to compute things through row-reduction, which I suspect is confusing you, use the definition of linear independence to find a value of $h$ that makes the set not linearly independent.
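For instance, here is a minimal sketch of that check for one arbitrary choice of $h$ (I picked $h=0$; the use of SymPy is an assumption): solve for coefficients $a$, $b$ with $a\,\vec{c_1} + b\,\vec{c_2} = \vec{c_3}$, where $\vec{c_1},\vec{c_2},\vec{c_3}$ are the columns of the matrix.

```python
import sympy as sp

# Columns of the matrix, with h = 0 chosen as an arbitrary sample value.
c1 = sp.Matrix([1, -4, 2])
c2 = sp.Matrix([-3, 7, -6])
c3 = sp.Matrix([4, 0, 8])      # third column with h = 0

a, b = sp.symbols('a b')
sol = sp.solve(list(a*c1 + b*c2 - c3), [a, b])
print(sol)                            # {a: -28/5, b: -16/5}
print(sol[a]*c1 + sol[b]*c2 - c3)     # zero vector: c3 is a combination of c1 and c2
```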

2

You are asking: for what values of $h$ are the vectors $$\vec{v_1}=\left(\begin{array}{r}1\\-4\\2\end{array}\right),\quad \vec{v_2}=\left(\begin{array}{r}-3\\7\\-6\end{array}\right),\quad \vec{v_3}=\left(\begin{array}{r}4\\h\\8\end{array}\right)$$ linearly dependent?

You seem to be trying to do this by looking at the equation $$\alpha\vec{v_1}+\beta\vec{v_2}+\gamma\vec{v_3}=\left(\begin{array}{c}0\\0\\0\end{array}\right)$$ and trying to determine for what values of $h$ there is a nonzero solution. This leads to the matrix you have: $$\left(\begin{array}{rrr|c} 1 & -3 & 4 & 0\\ -4 & 7 & h & 0\\ 2 & -6 & 8 & 0 \end{array}\right).$$ Now, since the third equation is a multiple of the first, that equation does not matter: it provides no new information. That means that you have a homogeneous system of two equations in three unknowns. Those systems always have infinitely many solutions. In particular, no matter what $h$ is, the system has infinitely many solutions, and so must have a nontrivial solution. Thus, the vectors are always linearly dependent.
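Indeed, one can ask for the null space of the coefficient matrix with $h$ left symbolic (a sketch assuming SymPy; it mirrors the argument above):

```python
import sympy as sp

h = sp.symbols('h')
A = sp.Matrix([[ 1, -3, 4],
               [-4,  7, h],
               [ 2, -6, 8]])

# A one-dimensional null space, whatever h is: any nonzero multiple of the
# basis vector ((3*h + 28)/5, (h + 16)/5, 1) is a nontrivial dependence
# relation among the columns.
print(A.nullspace())
```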

To understand what is happening, note that all three vectors lie in the plane $z=2x$. Any two vectors on the plane that are not collinear will span the plane. Since $\vec{v_1}$ and $\vec{v_2}$ are not collinear, and both lie on the plane $z=2x$, any vector that lies on the plane $z=2x$ will be a linear combination of $\vec{v_1}$ and $\vec{v_2}$. Or, put another way, three vectors in a $2$-dimensional space (a plane through the origin) are always linearly dependent.

Here you have three vectors that satisfy $z=2x$; every other vector that satisfies that equation is a linear combination of $\vec{v_1}$ and $\vec{v_2}$: if $(a,b,2a)^t$ lies in the plane, then the system $$\alpha\left(\begin{array}{r}1\\-4\\2\end{array}\right) + \beta\left(\begin{array}{r}-3\\7\\-6\end{array}\right) = \left(\begin{array}{c}a\\b\\2a\end{array}\right)$$ has a solution, namely $\alpha = -\frac{7a+3b}{5}$, $\beta=-\frac{4a+b}{5}$ (obtained by Gaussian elimination). In particular, since $\vec{v_3}$ lies in the plane $z=2x$ no matter what $h$ is, we will have $$\vec{v_3} = -\frac{28+3h}{5}\vec{v_1} - \frac{16+h}{5}\vec{v_2}.$$ Note that this makes sense no matter what $h$ is.
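That identity can be checked symbolically for arbitrary $h$; a minimal sketch, assuming SymPy is available:

```python
import sympy as sp

h = sp.symbols('h')
v1 = sp.Matrix([1, -4, 2])
v2 = sp.Matrix([-3, 7, -6])
v3 = sp.Matrix([4, h, 8])

# v3 = -(28 + 3h)/5 * v1 - (16 + h)/5 * v2, for every value of h
combo = -sp.Rational(1, 5)*(28 + 3*h)*v1 - sp.Rational(1, 5)*(16 + h)*v2
print((combo - v3).expand())   # Matrix([[0], [0], [0]])
```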

This can be read off your row-reduced matrix: you got $$\left(\begin{array}{rrr|c} 1 & -3 & 4 & 0\\ 0 & -5 & h+16 & 0\\ 0 & 0 & 0 & 0 \end{array}\right).$$ Divide the second row by $-5$ to get $$\left(\begin{array}{rrr|c} 1 & -3 & 4 & 0\\ 0 & 1 & -\frac{h+16}{5} & 0\\ 0 & 0 & 0 & 0 \end{array}\right),$$ and now add three times the second row to the first row to get $$\left(\begin{array}{rrc|c} 1 & 0 & 4+\frac{-3h-48}{5} & 0\\ 0 & 1 & -\frac{h+16}{5} & 0\\ 0 & 0 & 0 & 0 \end{array}\right) = \left(\begin{array}{rrc|c} 1 & 0 & -\frac{28+3h}{5} & 0\\ 0 & 1 & -\frac{h+16}{5} & 0\\ 0 & 0 & 0& 0 \end{array}\right).$$ So $\alpha$ and $\beta$ are leading variables, and $\gamma$ is a free variable. This tells you that the solutions to the original system are: $$\begin{align*} \alpha &= \frac{28+3h}{5}t\\ \beta &= \frac{h+16}{5}t\\ \gamma &= t \end{align*}$$ Any nonzero value of $t$ gives you a nontrivial solution, and $t=-1$ gives you the solution I give above.
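A quick symbolic check that this family really solves the original homogeneous system, for every $h$ and every $t$ (a sketch, again assuming SymPy):

```python
import sympy as sp

h, t = sp.symbols('h t')
A = sp.Matrix([[ 1, -3, 4],
               [-4,  7, h],
               [ 2, -6, 8]])

# The parametric solution read off from the row-reduced matrix.
x = sp.Matrix([(28 + 3*h)/5 * t,
               (h + 16)/5 * t,
               t])
print((A * x).expand())   # Matrix([[0], [0], [0]]): nontrivial whenever t != 0
```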

Of course, this can be done much more simply by noting that since your original matrix has linearly dependent rows (the third row is a scalar multiple of the first row), the dimension of the rowspace is at most $2$ (in fact, exactly $2$), and hence the dimension of the columnspace is at most $2$ as well (in fact, exactly $2$, since $\dim(\text{columnspace})=\dim(\text{rowspace})$); so the columns are always linearly dependent.