
I'm kicking myself over this one, but I just can't seem to make the argument rigorous. From Axler's Linear Algebra Done Right:

for a vector space $V$ with an underlying field $F$:

Take an element $a$ from $F$ and $\vec{v}$ from $V$. Then $a\vec{v}=\vec{0}\implies a=0$ or $\vec{v}=\vec{0}$.

After only being able to come up with half of a direct proof, I tried doing this by proving the contrapositive $a\neq 0 \wedge \vec{v} \neq \vec{0} \implies a\vec{v}\neq \vec{0}$

Say $a\vec{v}=\vec{u}$. Since $a$ is non-zero, we can divide both sides by $a$.

$\vec{v}=\frac 1 a \vec{u}$

If $\vec{u}$ were $\vec{0}$, then by

$\frac 1 a \vec{0}=\frac 1 a (\vec{0}+\vec{0})=\frac 1 a \vec{0}+\frac 1 a \vec{0}\implies\frac 1 a \vec{0}=\vec{0}$

$\vec{v}$ would be $\vec{0}$ as well. Since $\vec{v}\neq\vec{0}$ by assumption, $\frac 1 a \vec{u}=\vec{v}$ cannot be $\vec{0}$, and so $\vec{u}$ cannot be $\vec{0}$ either.

  1. Is this fully rigorous? It seems like a very simple question, but I'm not sure about it. Namely, the last step of $\frac 1 a \vec{u}\neq \vec{0} \implies \vec{u}\neq \vec{0}$ doesn't seem obvious. I think I need to use the $1\vec{v}=\vec{v}$ axiom, but I'm not sure how.
  2. Is there a more direct proof? This whole contrapositive business seems a bit clunky for something so simple.
  • Comment from the asker (2012-07-13): and of course I manage to miss that and somehow use a contrapositive anyway. Thanks.

3 Answers

Answer (12 votes)

Let $a\in F$, $\vec v\in V$, and suppose $a\vec v=\vec 0$. If $a\neq 0$, then $a^{-1}\in F$, so $\vec v=(a^{-1}a)\vec v=a^{-1}(a\vec v)=a^{-1}\vec 0=\vec 0$. Thus either $a=0$ or $\vec v=\vec 0$.
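This chain of equalities can be checked mechanically. Here is a sketch in Lean 4, assuming Mathlib (the lemma names `inv_mul_cancel₀`, `one_smul`, `mul_smul`, and `smul_zero` are Mathlib's):

```lean
import Mathlib

-- Sketch: a • v = 0 forces a = 0 or v = 0, following the answer's
-- computation v = (a⁻¹ * a) • v = a⁻¹ • (a • v) = a⁻¹ • 0 = 0.
example {F V : Type*} [Field F] [AddCommGroup V] [Module F V]
    (a : F) (v : V) (h : a • v = 0) : a = 0 ∨ v = 0 := by
  by_cases ha : a = 0
  · exact Or.inl ha
  · refine Or.inr ?_
    calc v = (a⁻¹ * a) • v := by rw [inv_mul_cancel₀ ha, one_smul]
      _ = a⁻¹ • (a • v) := by rw [mul_smul]
      _ = a⁻¹ • (0 : V) := by rw [h]
      _ = 0 := smul_zero a⁻¹
```

If I recall correctly, Mathlib also packages this statement directly as `smul_eq_zero` (an iff, valid whenever the scalars act without zero divisors), so in practice one would cite that lemma rather than redo the calculation.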

Answer (3 votes)

If you want to prove that "P or Q" holds, it is often useful to assume that one of the conditions fails, from which it may follow readily that the other must hold (which proves the statement).

In this case, we know that non-zero field elements have multiplicative inverses, and we can show easily that scalar multiples of the zero vector are again the zero vector. Then if we assume $a\neq 0$, it readily follows that $v=0$.
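The auxiliary fact used here, that scalar multiples of the zero vector vanish, can itself be spelled out as a cancellation argument. A Lean 4 sketch, assuming Mathlib (where this is the library lemma `smul_zero`):

```lean
import Mathlib

-- a • 0 = a • (0 + 0) = a • 0 + a • 0, and cancelling a • 0 on
-- the left leaves a • 0 = 0.  (This is Mathlib's `smul_zero`.)
example {F V : Type*} [Field F] [AddCommGroup V] [Module F V] (a : F) :
    a • (0 : V) = 0 := by
  have h : a • (0 : V) + a • (0 : V) = a • (0 : V) + 0 := by
    rw [← smul_add, add_zero, add_zero]
  exact add_left_cancel h
```

This is the same $\vec{0}+\vec{0}=\vec{0}$ trick the question uses for $\frac 1 a \vec{0}$, just written with a generic scalar $a$.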

Answer (2 votes)

We can also show the contrapositive: if $a \neq 0$ and $v \neq 0$, then $av \neq 0.$

Assume $a\neq 0$ and $v \neq 0.$ Suppose $V$ has finite dimension $n$ (this argument needs that assumption), and express $v$ in terms of a basis $e_1, \ldots, e_n$: $v = v_1 e_1 + \ldots + v_n e_n.$ Since $v \neq 0,$ at least one of the $v_i$ is non-zero; otherwise $v$ would be the zero vector. Now multiply $v$ by $a$ to get $av = (av_1) e_1 + \ldots + (av_n) e_n.$ Since $a \neq 0$ and a field has no zero divisors, at least one of the $av_i$ is non-zero, and by the linear independence of the $e_i$ this non-trivial linear combination cannot equal the $0$ vector. So $av \neq 0.$
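The contrapositive can also be verified without choosing a basis, which covers infinite-dimensional $V$ as well. A Lean 4 sketch, again assuming Mathlib:

```lean
import Mathlib

-- If a ≠ 0 and v ≠ 0, then a • v ≠ 0: supposing a • v = 0 and
-- multiplying through by a⁻¹ recovers v = 0, contradicting v ≠ 0.
example {F V : Type*} [Field F] [AddCommGroup V] [Module F V]
    (a : F) (v : V) (ha : a ≠ 0) (hv : v ≠ 0) : a • v ≠ 0 := by
  intro h
  apply hv
  rw [← one_smul F v, ← inv_mul_cancel₀ ha, mul_smul, h, smul_zero]
```

Note that this is really the first answer's argument run in reverse, which is why the basis-free version goes through.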