7
$\begingroup$

In my book, the result $(u\times v)\cdot(x\times y)=\begin{vmatrix} u\cdot x & v\cdot x \\u \cdot y & v \cdot y\end{vmatrix}$ is stated, where $u$, $v$, $x$ and $y$ are arbitrary vectors (here '$\cdot$' means the dot product and '$\times$' the cross product). The book says only, very briefly, that this can easily be proved by observing that both sides are linear in $u$, $v$, $x$ and $y$.

I know that if I expand and simplify the LHS using the components of the vectors, the result turns out to be true. However, I don't really understand what the book means by 'both sides are linear in $u$, $v$, $x$ and $y$', or how noticing this fact makes the relation easier to prove.

Any help will be greatly appreciated.

  • 0
    One might also note that both sides of the equation are equal to $\left| \begin{matrix} x \\ y \end{matrix} \right| \cdot \left| \begin{matrix} u & v \end{matrix} \right|$.2012-10-09

2 Answers

4

Suppose we have two linear functions, $f$ and $g$, which agree on all the basis vectors of some space. Then they must agree for every vector in that space, because they are both linear, and a linear function is completely determined by its values on a basis.

In gory detail, suppose that we know that $f(\vec{e_i}) = g(\vec{e_i})$ for each basis vector $\vec{e_i}$.

Consider some vector $\vec v$. We can express $\vec v$ as a linear combination of basis vectors, say as $\vec v = c_1\vec{e_1} + \cdots + c_n\vec{e_n}.$ Then we know that $\begin{align} f(\vec v) & = f(c_1\vec{e_1} + \cdots + c_n\vec{e_n}) \\ & = c_1f(\vec{e_1}) + \cdots + c_nf(\vec{e_n}) & \text{(linearity of $f$)} \\ & = c_1g(\vec{e_1}) + \cdots + c_ng(\vec{e_n}) & \text{($f=g$ for basis vectors)} \\ & = g(c_1\vec{e_1} + \cdots + c_n\vec{e_n}) & \text{(linearity of $g$)}\\ & = g(\vec v) \end{align} $
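As a numerical illustration of this argument (not the proof itself), here is a small Python sketch with two linear maps on $\mathbb{R}^3$ that agree on the standard basis; the specific functional and vectors are my own choices:

```python
# Two descriptions of the same linear functional on R^3.

def f(v):
    # f written out coordinate by coordinate: 2*v1 - v2 + 3*v3
    return 2*v[0] - v[1] + 3*v[2]

def g(v):
    # the same functional written as a dot product with w = (2, -1, 3)
    w = (2, -1, 3)
    return sum(wi * vi for wi, vi in zip(w, v))

basis = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]

# f and g agree on the basis vectors...
assert all(f(e) == g(e) for e in basis)

# ...so by the linearity argument above they agree on any vector,
# e.g. v = 5*e1 - 7*e2 + 2*e3:
v = (5, -7, 2)
assert f(v) == g(v)  # both give 23
```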

One can similarly show an analogous fact for functions of several variables. For example, if $f(u,v,x,y)$ and $g(u,v,x,y)$ are linear functions of $u, v, x,$ and $y$, and if they agree on all combinations of the basis vectors for some space, then they agree on every vector in that space.

Now take $f(u,v,x,y) = (u\times v)\cdot(x\times y)$ and $g(u,v,x,y) =\begin{vmatrix} u\cdot x & v\cdot x \\u \cdot y & v \cdot y\end{vmatrix}$. Each is easily seen to be linear in each argument (or easy to show, if you don't see it, using properties of the cross and dot products for $f$, and of determinants and dot products for $g$). So if you can show that they are equal when $u,v,x,$ and $y$ are basis vectors, you are done. And for most choices of basis vectors as arguments, both sides are equal to zero, so this is quick to verify.
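The whole strategy can be checked by machine: verify $f = g$ on all $3^4 = 81$ basis combinations, and then (as a sanity check on the multilinearity argument) on some arbitrary integer vectors. A minimal sketch, with hand-rolled helpers `dot`, `cross`, `lhs`, `rhs` of my own naming:

```python
def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def lhs(u, v, x, y):
    # (u x v) . (x x y)
    return dot(cross(u, v), cross(x, y))

def rhs(u, v, x, y):
    # 2x2 determinant | u.x  v.x ; u.y  v.y |
    return dot(u, x)*dot(v, y) - dot(v, x)*dot(u, y)

basis = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]

# all 81 basis combinations -- most reduce to 0 = 0
assert all(lhs(u, v, x, y) == rhs(u, v, x, y)
           for u in basis for v in basis
           for x in basis for y in basis)

# by linearity in each argument the identity then holds in general,
# e.g. for these arbitrary integer vectors:
u, v, x, y = (1, 2, 3), (4, 0, -1), (2, -2, 5), (0, 3, 1)
assert lhs(u, v, x, y) == rhs(u, v, x, y)
```

Since everything here is integer arithmetic, the comparisons are exact; no floating-point tolerance is needed.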

  • 1
    Thanks for the clear and detailed explanation, it helped a lot.2012-10-09
5

It means that you can verify the relation just on the standard basis $\{e_1, e_2, e_3\}$ of three-dimensional space. For example, you should check that $(e_1 \times e_2)\cdot (e_2 \times e_3) = 0$, which agrees with the right-hand side.

  • 3
    Any vector is a linear combination of these three basis elements. If a relation is linear, it is enough to verify it on just one basis of the vector space. Suppose you want to show $u\cdot v=v\cdot u$. If it is correct on the $e_i$ then write $u$ and $v$ as linear combinations of the $e_i$, use linearity and you will see the result.2012-10-09