
I asked the same question yesterday, but this one is a bit different in terms of computations. It is from my exam I took an hour ago.

For what $h$ are the columns of this matrix linearly dependent? $\begin{bmatrix} 1 & -3 & 4 \\ -4 & 7 & h\\ 2 & -6 & 8 \end{bmatrix}$

Attempt: after row reducing, but not completely: $\begin{bmatrix} 1 & -3 & 4 & 0 \\ -4 & 7 & h & 0 \\ 2 & -6 & 8 & 0 \end{bmatrix} \sim \begin{bmatrix} 1 & -3 & 4 & 0 \\ 0 & -5 & h+16 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix} $

My guess was that if $h=-16$ or $h=-\frac{28}{3}$, the columns are linearly dependent. And I just guessed the $-16$. Hints please.

  • 1
    @Dostre: And why $-16$? Because that makes the entry equal to $0$? So what? That value comes from selecting $x_2=0$; but pick your favorite value of $x_2$, and then you can find a value of $h$ that will make the equation true. Conversely, pick your favorite value for $h$, solve for $x_2$ from $-5x_2 + h + 16 = 0$; and then use the value to solve for $x_1$ from $x_1-3x_2+4=0$ (having selected $x_3=1$): *every* value of $h$ gives you a solution. (2012-07-19)

3 Answers

0

Compute the determinant; the roots of $\det = 0$ are the values of $h$ that make the set linearly dependent.
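Following this suggestion symbolically (a sketch, not part of the answer, assuming sympy is available) shows why the approach gives no restriction here:

```python
# Symbolic sanity check (assumes sympy): compute the determinant
# with h left as a symbol.
import sympy as sp

h = sp.symbols('h')
A = sp.Matrix([[ 1, -3, 4],
               [-4,  7, h],
               [ 2, -6, 8]])

d = sp.expand(A.det())
print(d)  # prints 0: the determinant vanishes identically in h
```

The determinant is $0$ for *every* $h$, which is exactly the point of the comment below: the columns are linearly dependent no matter what $h$ is.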

  • 0
    Then for all $h$, the columns are linearly dependent, since column rank = row rank. (2012-07-19)
2

Based on your question from before, let's take a step back and revisit what linear dependence means.

Specifically, a set of vectors is linearly dependent if it is not linearly independent. Well maybe that's not useful, but now we get to talk about linear independence!

A set of vectors is linearly independent if none of the vectors in the set can be written as a linear combination of finitely many other vectors in the set. So, if you can write one of the vectors, say, $v_2$ as a combination of some finite number of other vectors, then your set is linearly dependent, e.g. $v_2 = 2v_1 + 5v_6$.

Rather than trying to compute things through row-reduction, which I suspect is confusing you, use the definition of linear independence to find a value of $h$ that makes the set not linearly independent.
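One way to follow this advice with a computer algebra system (a sketch, assuming sympy; the variable names are my own): try to write the third column as a combination of the first two, with $h$ left symbolic, and see what constraint on $h$ falls out.

```python
# Sketch (assumes sympy): solve a*v1 + b*v2 = v3 with h symbolic.
import sympy as sp

a, b, h = sp.symbols('a b h')
v1 = sp.Matrix([1, -4, 2])
v2 = sp.Matrix([-3, 7, -6])
v3 = sp.Matrix([4, h, 8])

# Three scalar equations in the two unknowns a, b.
eqs = list(a*v1 + b*v2 - v3)
sol = sp.solve(eqs, [a, b])
print(sol)  # a solution exists for every h
```

The solve succeeds with $a$ and $b$ expressed in terms of $h$, so no constraint on $h$ appears: the set fails to be linearly independent for every value of $h$.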

2

You are asking: for what values of $h$ are the vectors $\vec{v_1}=\left(\begin{array}{r}1\\-4\\2\end{array}\right),\quad \vec{v_2}=\left(\begin{array}{r}-3\\7\\-6\end{array}\right),\quad \vec{v_3}=\left(\begin{array}{r}4\\h\\8\end{array}\right)$ linearly dependent?

You seem to be trying to do this by looking at the equation $\alpha\vec{v_1}+\beta\vec{v_2}+\gamma\vec{v_3}=\left(\begin{array}{c}0\\0\\0\end{array}\right)$ and trying to determine for what values of $h$ there is a nonzero solution. This leads to the matrix you have: $\left(\begin{array}{rrr|c} 1 & -3 & 4 & 0\\ -4 & 7 & h & 0\\ 2 & -6 & 8 & 0 \end{array}\right).$ Now, since the third equation is a multiple of the first, that equation does not matter: it provides no new information. That means that you have a homogeneous system of two equations in three unknowns. Those systems always have infinitely many solutions. In particular, no matter what $h$ is, the system has infinitely many solutions, and so must have a nontrivial solution. Thus, the vectors are always linearly dependent.
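A quick numerical spot-check of this conclusion (a sketch, assuming numpy; the sample values of $h$ are arbitrary):

```python
# Numerical check (assumes numpy): the matrix has rank 2 no matter
# what h is, so its three columns are always linearly dependent.
import numpy as np

for h in (-16.0, -28.0/3.0, 0.0, 17.5):
    A = np.array([[ 1.0, -3.0, 4.0],
                  [-4.0,  7.0, h],
                  [ 2.0, -6.0, 8.0]])
    print(h, np.linalg.matrix_rank(A))  # rank 2 for each sample h
```

Rank $2 < 3$ for every sampled $h$ is consistent with the argument above: the third row carries no new information, so a nontrivial solution always exists.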

To understand what is happening, note that all three vectors lie in the plane $z=2x$. Any two vectors on the plane that are not collinear will span the plane. Since $\vec{v_1}$ and $\vec{v_2}$ are not collinear, and both lie on the plane $z=2x$, any vector that lies on the plane $z=2x$ will be a linear combination of $\vec{v_1}$ and $\vec{v_2}$. Or, put another way, three vectors in a $2$-dimensional space (a plane through the origin) are always linearly dependent.

Here you have three vectors that satisfy $z=2x$; every other vector that satisfies that equation is a linear combination of $\vec{v_1}$ and $\vec{v_2}$: if $(a,b,2a)^t$ lies in the plane, then the system $\alpha\left(\begin{array}{r}1\\-4\\2\end{array}\right) + \beta\left(\begin{array}{r}-3\\7\\-6\end{array}\right) = \left(\begin{array}{c}a\\b\\2a\end{array}\right)$ has a solution, namely $\alpha = -\frac{7a+3b}{5}$, $\beta=-\frac{4a+b}{5}$ (obtained by Gaussian elimination). In particular, since $\vec{v_3}$ lies in the plane $z=2x$ no matter what $h$ is, we will have $\vec{v_3} = -\frac{28+3h}{5}\vec{v_1} - \frac{16+h}{5}\vec{v_2}.$ Note that this makes sense no matter what $h$ is.

This can be read off your row-reduced matrix: you got $\left(\begin{array}{rrr|c} 1 & -3 & 4 & 0\\ 0 & -5 & h+16 & 0\\ 0 & 0 & 0 & 0 \end{array}\right).$ Divide the second row by $-5$ to get $\left(\begin{array}{rrr|c} 1 & -3 & 4 & 0\\ 0 & 1 & -\frac{h+16}{5} & 0\\ 0 & 0 & 0 & 0 \end{array}\right),$ and now add three times the second row to the first row to get $\left(\begin{array}{rrc|c} 1 & 0 & 4+\frac{-3h-48}{5} & 0\\ 0 & 1 & -\frac{h+16}{5} & 0\\ 0 & 0 & 0 & 0 \end{array}\right) = \left(\begin{array}{rrc|c} 1 & 0 & -\frac{28+3h}{5} & 0\\ 0 & 1 & -\frac{h+16}{5} & 0\\ 0 & 0 & 0& 0 \end{array}\right).$ So $\alpha$ and $\beta$ are leading variables, and $\gamma$ is a free variable. This tells you that the solutions to the original system are: $\begin{align*} \alpha &= \frac{28+3h}{5}t\\ \beta &= \frac{h+16}{5}t\\ \gamma &= t \end{align*}$ Any nonzero value of $t$ gives you a nontrivial solution, and $t=-1$ gives you the solution I give above.
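These formulas can be checked symbolically (a sketch, assuming sympy; not part of the original answer):

```python
# Symbolic verification (assumes sympy): the solution family
# alpha = (28+3h)/5 * t, beta = (h+16)/5 * t, gamma = t
# satisfies alpha*v1 + beta*v2 + gamma*v3 = 0 for all h and t.
import sympy as sp

h, t = sp.symbols('h t')
v1 = sp.Matrix([1, -4, 2])
v2 = sp.Matrix([-3, 7, -6])
v3 = sp.Matrix([4, h, 8])

alpha = (28 + 3*h)/5 * t
beta  = (h + 16)/5 * t
gamma = t

combo = sp.expand(alpha*v1 + beta*v2 + gamma*v3)
print(combo.T)  # the zero vector, identically in h and t
```

Setting $t=-1$ recovers the relation $\vec{v_3} = -\frac{28+3h}{5}\vec{v_1} - \frac{16+h}{5}\vec{v_2}$ given above.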

Of course, this can be done much more simply by noting that since your original matrix has linearly dependent rows (the third row is a scalar multiple of the first row), the dimension of the rowspace is at most $2$ (in fact, exactly $2$), and hence the dimension of the columnspace is at most $2$ (in fact, exactly $2$, since $\dim(\text{columnspace})=\dim(\text{rowspace})$), so the columns are always linearly dependent.