I was recently reviewing an old exam and noticed I lost marks for the following Q/A. I cannot for the life of me figure out why this was the case. If someone could highlight what I did incorrectly and suggest a correction, that would be greatly appreciated.
Question:
Prove $a \cdot b = 0 \iff a = 0 \vee b = 0$ where $a$ and $b$ are elements of a field $F$.
Answer:
First we show $a = 0 \vee b = 0 \Longrightarrow a \cdot b = 0$.
Let $x \in F$ be arbitrary. Then $0 \cdot x = 0 \cdot x + 0$. But $x \in F$, so there exists an additive inverse $(-x)$ for $x$; hence $0 \cdot x + 0 = 0 \cdot x + (x + (-x))$. Using the associativity of addition, the commutativity of multiplication, the multiplicative identity, and the distributivity of multiplication over addition in $F$, we have $0 \cdot x + (x + (-x)) = (0 \cdot x + x) + (-x) = x \cdot (0 + 1) + (-x)$. Thus $0 \cdot x = x \cdot 1 + (-x) = x + (-x) = 0$. Because $x$ was arbitrary, we can conclude that $0$ multiplied by any element of $F$ is $0$. Given that $a = 0 \vee b = 0$, it follows that $a \cdot b = 0$.
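To sanity-check this direction (this was not part of my exam answer), here is a minimal Lean 4 sketch, assuming Mathlib. The helper name `zero_mul'` is mine; it proves the $0 \cdot x = 0$ lemma by a cancellation variant of the same distributivity idea rather than the exact chain above.

```lean
import Mathlib

-- Forward direction: a = 0 ∨ b = 0 → a * b = 0, over any field F.
example {F : Type*} [Field F] (a b : F) (h : a = 0 ∨ b = 0) : a * b = 0 := by
  -- Key lemma: 0 * x = 0, from distributivity plus additive cancellation.
  have zero_mul' : ∀ x : F, 0 * x = 0 := by
    intro x
    have hx : 0 * x + 0 * x = 0 * x + 0 := by
      rw [← add_mul, add_zero, add_zero]  -- rewrite LHS as (0 + 0) * x, then simplify
    exact add_left_cancel hx
  rcases h with ha | hb
  · rw [ha]; exact zero_mul' b            -- case a = 0
  · rw [hb, mul_comm]; exact zero_mul' a  -- case b = 0, via commutativity
```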
Next we show $a \cdot b = 0 \Longrightarrow a = 0 \vee b = 0$. If we assume the contrary, we have $a \cdot b = 0 \wedge (a \ne 0 \wedge b \ne 0)$. Since $a \ne 0$, there exists a multiplicative inverse $a^{-1}$ for $a$, so $a^{-1} \cdot (a \cdot b) = a^{-1} \cdot 0 = 0$, where the last equality follows from the first part together with commutativity. But (remembering that associativity holds) the left-hand side reduces to $(a^{-1} \cdot a) \cdot b = 1 \cdot b = b$, so $b = 0$, contradicting $b \ne 0$. Hence $a \cdot b = 0 \Longrightarrow a = 0 \vee b = 0$, and it follows that $a \cdot b = 0 \iff a = 0 \vee b = 0$.
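And here is the corresponding Lean 4 sketch for this direction, again assuming Mathlib. It replaces my proof by contradiction with a direct case split on whether $a = 0$; `inv_mul_cancel₀` is Mathlib's lemma that $a \ne 0$ implies $a^{-1} \cdot a = 1$.

```lean
import Mathlib

-- Reverse direction: a * b = 0 → a = 0 ∨ b = 0, over any field F.
example {F : Type*} [Field F] (a b : F) (h : a * b = 0) : a = 0 ∨ b = 0 := by
  by_cases ha : a = 0
  · exact Or.inl ha
  · refine Or.inr ?_
    -- a ≠ 0, so multiply a * b = 0 on the left by a⁻¹.
    calc b = 1 * b := (one_mul b).symm
      _ = (a⁻¹ * a) * b := by rw [inv_mul_cancel₀ ha]
      _ = a⁻¹ * (a * b) := mul_assoc _ _ _
      _ = a⁻¹ * 0 := by rw [h]
      _ = 0 := mul_zero _
```

Of course, Mathlib already records the full statement as `mul_eq_zero`, so these sketches are purely to check that the exam statement itself is correct.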