4

Given two $3$D vectors $\mathbf{u}$ and $\mathbf{v}$, their cross product $\mathbf{u} \times \mathbf{v}$ can be defined by the property that, for any vector $\mathbf{x}$, one has $\langle \mathbf{x} ; \mathbf{u} \times \mathbf{v} \rangle = \det(\mathbf{x}, \mathbf{u},\mathbf{v})$. From this a number of properties of the cross product can be obtained quite easily. It is less obvious that, for instance, $|\mathbf{u} \times \mathbf{v}|^2 = |\mathbf{u}|^2 |\mathbf{v}|^2 - \langle \mathbf{u} ; \mathbf{v} \rangle ^2$, from which the norm of the cross product can be deduced.
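(As a quick sanity check only, not the coordinate-free argument asked for below, both the defining property and this identity can be verified symbolically, e.g. with sympy and the usual coordinate formulas:)

```python
# Symbolic sanity check of the defining property and of the identity
# |u x v|^2 = |u|^2 |v|^2 - <u, v>^2, using the usual coordinate formulas.
import sympy as sp

u = sp.Matrix(sp.symbols('u1 u2 u3'))
v = sp.Matrix(sp.symbols('v1 v2 v3'))
x = sp.Matrix(sp.symbols('x1 x2 x3'))

# Defining property: <x, u x v> = det(x, u, v) for every x
print(sp.simplify(x.dot(u.cross(v)) - sp.Matrix.hstack(x, u, v).det()))  # 0

# Lagrange identity: |u x v|^2 = |u|^2 |v|^2 - <u, v>^2
lhs = u.cross(v).dot(u.cross(v))
rhs = u.dot(u) * v.dot(v) - u.dot(v)**2
print(sp.simplify(lhs - rhs))  # 0
```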

Is it possible to obtain these properties nicely (i.e. without dealing with coordinates), but with elementary linear algebra only (i.e. without the exterior algebra machinery, using only properties of determinants and matrix/vector multiplication)?

Thanks in advance!

  • 0
    Maybe defining the cross product with a skew-symmetric matrix, i.e. $u \times v=S(u)v$, would be useful. – 2017-02-03
  • 0
    Thanks for the suggestion; I already tried that, and it leads to computations that are at least as long as those arising from the coordinate definition. – 2017-02-03

3 Answers

0

An attempt to prove

$|\mathbf{u} \times \mathbf{v}|^2 - |\mathbf{u}|^2 |\mathbf{v}|^2 + \langle \mathbf{u} ; \mathbf{v} \rangle ^2=0$

using the formula

$\mathbf{u} \times \mathbf{v}= \mathbf {S(u)}\mathbf{v}$, where $\mathbf {S(u)}$ is a skew-symmetric matrix.

For simplicity, normalize so that $|\mathbf{u}|=1$.

The left-hand side can then be rewritten step by step:

$\mathbf {(S(u)v)}^T \mathbf {S(u)v}-(\mathbf{v}^T \mathbf{v})( \mathbf{u}^T \mathbf{u}) +(\mathbf{v}^T \mathbf{u})(\mathbf{u}^T \mathbf{v})$
$= \mathbf {v^TS(u)^T} \mathbf {S(u)v} - \mathbf{v}^T \mathbf{v}\, \mathbf{u}^T \mathbf{u} +\mathbf{v}^T \mathbf{u}\,\mathbf{u}^T \mathbf{v}$
$= \mathbf {-v^TS^2(u)} \mathbf {v} -\mathbf{v}^T \mathbf{v}\, \mathbf{u}^T \mathbf{u} + \mathbf{v}^T \mathbf{u}\,\mathbf{u}^T \mathbf{v}$
$= \mathbf {-v^T( uu^T-I)} \mathbf {v} - \mathbf{v}^T \mathbf{v}\, \mathbf{u}^T \mathbf{u} +\mathbf{v}^T \mathbf{u}\,\mathbf{u}^T \mathbf{v}$
$= -\mathbf {v^T uu^T v}+ \mathbf {v}^T \mathbf {v} - \mathbf{v}^T \mathbf{v}\, \mathbf{u}^T \mathbf{u} +\mathbf{v}^T \mathbf{u}\, \mathbf{u}^T \mathbf{v}$
$= \mathbf {v}^T \mathbf {v} - \mathbf{v}^T \mathbf{v}\, \mathbf{u}^T \mathbf{u}$
$= \mathbf {v}^T \mathbf {v}\, (1-\mathbf {u}^T \mathbf {u})= 0$.

So in this case the identity holds. I hope all the steps are clear.
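If one prefers not to assume $|\mathbf{u}|=1$, a short scaling argument (sketched here) recovers the general case: write $\mathbf{u} = |\mathbf{u}|\,\hat{\mathbf{u}}$ with $|\hat{\mathbf{u}}| = 1$ and use the bilinearity of the cross product:

$$|\mathbf{u}\times \mathbf{v}|^2 = |\mathbf{u}|^2\,|\hat{\mathbf{u}}\times \mathbf{v}|^2 = |\mathbf{u}|^2\left(|\mathbf{v}|^2 - \langle \hat{\mathbf{u}} ; \mathbf{v}\rangle^2\right) = |\mathbf{u}|^2 |\mathbf{v}|^2 - \langle \mathbf{u} ; \mathbf{v}\rangle^2.$$

The case $\mathbf{u} = \mathbf{0}$ is trivial.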

  • 0
    Great! That's exactly what I was looking for! The normalization step appears to be very important. My previous attempt used components, so I ran into long computations... It must be said that $\mathbf{u}\mathbf{u}^T - I = S^2$ does, in fact, need some "dirty" work. – 2017-02-03
  • 0
    I am a little surprised myself that the transformations above led to the correct result... Crucial here was expressing the squares of lengths with the scalar product (which is commutative). Also important is that the square of a $3\times 3$ skew-symmetric matrix can be expressed with a projection matrix. The formula $uu^T - I = S^2$ can also be deduced from the fact that the skew-symmetric matrix $S$ is the composition of the projection onto the plane orthogonal to $u$, namely $P = I - uu^T$, with a rotation by $\pi/2$ in that plane; its square is therefore the composition of this projection with a rotation by $\pi$, so $S^2 = uu^T - I$. – 2017-02-04
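To spell out that last observation (a sketch, writing $R$ for the rotation by $\pi/2$ about $u$ and $P = I - uu^T$ for the projection, so that $S = RP$): since $R$ commutes with $P$ and $P^2 = P$,

$$S^2 = RPRP = R^2 P^2 = R^2 P = -P = uu^T - I,$$

because $R^2$, the rotation by $\pi$ about $u$, acts as $-I$ on the plane orthogonal to $u$.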
1

Hint

$$|\mathbf{u}\times \mathbf{v}| = |\mathbf{u}|\cdot|\mathbf{v}|\cdot|\sin \alpha|$$ $$\langle \mathbf{u}, \mathbf{v}\rangle = |\mathbf{u}|\cdot|\mathbf{v}|\cdot\cos \alpha$$ where $\alpha$ is the angle between the vectors $\mathbf{u}$ and $\mathbf{v}$.
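Spelling the hint out (one line, using $\sin^2\alpha = 1-\cos^2\alpha$):

$$|\mathbf{u}\times \mathbf{v}|^2 = |\mathbf{u}|^2|\mathbf{v}|^2\sin^2\alpha = |\mathbf{u}|^2|\mathbf{v}|^2\left(1-\cos^2\alpha\right) = |\mathbf{u}|^2|\mathbf{v}|^2 - \langle \mathbf{u}, \mathbf{v}\rangle^2.$$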

  • 0
    Thanks for your time. I wasn't clear enough, sorry. I can use the cosine of the angle (since it relies only on Cauchy-Schwarz), but I want to obtain the first identity you mention, ideally without using coordinates. Of course I am aware of what you point out, but I want to deduce these properties from the definition using the determinant. – 2017-02-03
1

I went through some books and found something I am happier with. It comes from Euclidean and Non-Euclidean Geometry: An Analytic Approach, by Ryan (p. 85, or thereabouts).

Essentially it goes as follows: $\mathbf{n} = \mathbf{u} \times \mathbf{v}$ is defined by the property that for every $\mathbf{x}$ one has $\langle \mathbf{x} ; \mathbf{n}\rangle = \det(\mathbf{x},\mathbf{u},\mathbf{v})$.
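For instance, orthogonality to both factors is immediate, since a determinant with a repeated column vanishes:

$$\langle \mathbf{u} ; \mathbf{n}\rangle = \det(\mathbf{u},\mathbf{u},\mathbf{v}) = 0, \qquad \langle \mathbf{v} ; \mathbf{n}\rangle = \det(\mathbf{v},\mathbf{u},\mathbf{v}) = 0.$$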

  • Antisymmetry and linearity follow directly from the corresponding properties of the determinant.

  • It is also easy to get $\langle \mathbf{u} ; \mathbf{v} \times \mathbf{w} \rangle = \langle \mathbf{u}\times \mathbf{v} ; \mathbf{w}\rangle$;

  • Using linearity, and restricting in a clever way to the basis vectors, one shows that $\mathbf{u}\times (\mathbf{v}\times \mathbf{w}) = \langle \mathbf{u};\mathbf{w} \rangle \mathbf{v} - \langle\mathbf{u};\mathbf{v}\rangle\mathbf{w}$. This is the only part in which some "dirty" work is needed, but it is not too bad: using symmetry arguments and linearity, one really needs very little computation.

  • Using the last two points, one gets $\langle \mathbf{u}\times \mathbf{v}; \mathbf{w}\times \mathbf{z}\rangle = \langle\mathbf{u};\mathbf{w} \rangle \langle\mathbf{v};\mathbf{z} \rangle - \langle\mathbf{v};\mathbf{w} \rangle \langle \mathbf{u};\mathbf{z} \rangle$ (see the sketch after this list).

  • From this one gets the Lagrange identity, which, by the way, gives another proof of the Cauchy-Schwarz inequality.
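A sketch of the step mentioned above, combining the previous two bullets (first move one factor across the scalar product, then expand the triple product, using antisymmetry):

$$\langle \mathbf{u}\times \mathbf{v}; \mathbf{w}\times \mathbf{z}\rangle = \langle (\mathbf{u}\times \mathbf{v})\times \mathbf{w}; \mathbf{z}\rangle = \langle\, \langle \mathbf{u};\mathbf{w}\rangle \mathbf{v} - \langle \mathbf{v};\mathbf{w}\rangle \mathbf{u} \,;\, \mathbf{z}\rangle = \langle\mathbf{u};\mathbf{w} \rangle \langle\mathbf{v};\mathbf{z} \rangle - \langle\mathbf{v};\mathbf{w} \rangle \langle \mathbf{u};\mathbf{z} \rangle,$$

and taking $\mathbf{w}=\mathbf{u}$, $\mathbf{z}=\mathbf{v}$ gives the Lagrange identity $|\mathbf{u}\times \mathbf{v}|^2 = |\mathbf{u}|^2|\mathbf{v}|^2 - \langle \mathbf{u};\mathbf{v}\rangle^2$.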