
How would I be able to prove the following statement:

Let $x$ and $y$ be linearly independent elements of a vector space $V$. Show that $u = ax+by$ and $v=cx+dy$ are linearly independent if and only if $ad-bc$ does not equal $0$.

I know that $u$ and $v$ are linear combinations of $V$ which will make them span $V$. Also, if the determinant is equal to $0$ then it will be a singular matrix and if it is singular then it will have free variables which will make it dependent, but how can I show this mathematically?

  • Why do you say that $u$ and $v$ span $V$? $V$ could be $100$-dimensional. Step 1 of learning how to write mathematics is being careful with the meanings of your words. (2012-09-22)
  • Oops, you are right. I am sorry for that! Thank you for clarifying. (2012-09-22)

2 Answers


You don’t have any reason to think that $x$ and $y$ span $V$: all you know is that they are linearly independent. The dimension of $V$ might well be greater than $2$, in which case no two-element subset of $V$ will span $V$, though many will be linearly independent.

You have two things to show:

  1. If $ad-bc\ne0$, then $u$ and $v$ are linearly independent.
  2. If $u$ and $v$ are linearly independent, then $ad-bc\ne0$.

It’s probably easiest to prove (1) by proving the contrapositive: if $u$ and $v$ are linearly dependent, then $ad-bc=0$. That’s because the assumption of linear dependence gives you something very concrete to work with: if $u$ and $v$ are linearly dependent, there are scalars $\alpha$ and $\beta$, at least one of which is non-zero, such that $\alpha u+\beta v=0$. Now write this out in terms of $x$ and $y$:

$$\alpha(ax+by)+\beta(cx+dy)=0\;.$$

Collect the $x$ and $y$ terms on the lefthand side:

$$(\alpha a+\beta c)x+(\alpha b+\beta d)y=0\;.$$

By hypothesis $x$ and $y$ are linearly independent, so

$$\left\{\begin{align*} &\alpha a+\beta c=0\\ &\alpha b+\beta d=0\;. \end{align*}\right.$$

This says that $\begin{bmatrix}\alpha\\\beta\end{bmatrix}$ is a non-zero solution to the equation

$$\begin{bmatrix}a&b\\c&d\end{bmatrix}z=0\;.$$

What does that tell you about $\det\begin{bmatrix}a&b\\c&d\end{bmatrix}$, and hence about $ad-bc$?
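
(For reference, a sketch of one way to verify that conclusion directly from the two equations above, without quoting any matrix theory:
$$d(\alpha a+\beta c)-c(\alpha b+\beta d)=\alpha(ad-bc)=0\;,\qquad a(\alpha b+\beta d)-b(\alpha a+\beta c)=\beta(ad-bc)=0\;,$$
and since at least one of $\alpha,\beta$ is non-zero, this forces $ad-bc=0$.)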

To prove (2), again go for the contrapositive: if $ad-bc=0$, then $u$ and $v$ are linearly dependent. You can pretty much just reverse the reasoning in the argument that I outlined for (1).
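
To make that reversal concrete, here is a sketch of one way to do it (not the only one): if $ad-bc=0$ and $a$, $c$ are not both zero, take $\alpha=c$ and $\beta=-a$. Then
$$\alpha a+\beta c=ca-ac=0\qquad\text{and}\qquad\alpha b+\beta d=cb-ad=0\;,$$
so $\alpha u+\beta v=(\alpha a+\beta c)x+(\alpha b+\beta d)y=0$ with $(\alpha,\beta)\ne(0,0)$, i.e. $u$ and $v$ are linearly dependent. If $a=c=0$, the pair $\alpha=d$, $\beta=-b$ works instead (and if all four coefficients vanish, then $u=v=0$ and any non-zero pair will do).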

  • Beautiful, thank you very much. I understand now thanks to your help!!! (2012-09-22)

For one direction, suppose $ad-bc \ne 0$, and suppose $\lambda, \mu \in \mathbb{R}$ satisfy $$\lambda u + \mu v = 0\;.$$ Show that this implies $\lambda = \mu = 0$, and hence that $u$ and $v$ are linearly independent.
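
(In case the hint needs unpacking, a sketch of the computation being asked for: writing the relation in terms of $x$ and $y$ and collecting terms gives $(\lambda a+\mu c)x+(\lambda b+\mu d)y=0$, so by the independence of $x$ and $y$,
$$\lambda a+\mu c=0\qquad\text{and}\qquad\lambda b+\mu d=0\;.$$
Multiplying the first by $d$, the second by $c$ and subtracting gives $\lambda(ad-bc)=0$; multiplying the first by $b$, the second by $a$ and subtracting gives $\mu(bc-ad)=0$. Since $ad-bc\ne0$, it follows that $\lambda=\mu=0$.)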

For the other direction, suppose $ad-bc = 0$ and find nonzero $\lambda, \mu$ satisfying the above equation.

Post in the comments if you need more help.

  • I am going to replace your lambda and mu with i and j respectively, because I don't know how to write those here. So, I got iu+jv=0, which implies that iu=-jv; then if we substitute for iu we get -jv+jv=0, and if we factor out j we get j(v-v)=0, thus j(0)=0 -> j=0. Is that right? (2012-09-22)
  • Absolutely not; $\mu \cdot 0 = 0$ for any $\mu$, so it certainly doesn't imply $\mu = 0$. You need to write $u$ and $v$ in the above equation in terms of $x$ and $y$, and collect terms. Then use the raw definition of linear independence of $x$ and $y$. (2012-09-22)
  • I am going to go out for a jog to clear my head, because that is a stupid mistake I made. Brian above provided me with detailed instructions, so I understand now. But still, thank you a lot Clive! (2012-09-22)