2

This formula just popped up in the textbook I'm reading, without any explanation:
$ (\vec{A} \times \vec{B}) \times \vec{C} = (\vec{A}\cdot\vec{C})\vec{B}-(\vec{B}\cdot\vec{C})\vec{A}$
I did some "vector arithmetic" using the determinant method, but I can't get my answer to agree with the formula above.
Has anyone seen this formula before, and does anyone know a proof of it?

The final result that I get is
$[b_{1}(a_{3}c_{3}+a_{2}c_{2})-a_{1}(b_{2}c_{2}+b_{3}c_{3})]\,\vec{i}$
$[b_{2}(a_{3}c_{3}+a_{1}c_{1})-a_{2}(b_{3}c_{3}+b_{1}c_{1})]\,\vec{j}$
$[b_{3}(a_{2}c_{2}+a_{1}c_{1})-a_{3}(b_{2}c_{2}+b_{1}c_{1})]\,\vec{k}$

But I fail to see how the $(\vec{A}\cdot\vec{C})$ and $(\vec{B}\cdot\vec{C})$ parts are supposed to emerge from this...
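
For what it's worth, a quick numerical sanity check (a minimal sketch, assuming numpy is available; the random vectors are only an illustration) suggests the formula itself is correct, so the discrepancy must be somewhere in my algebra:

```python
import numpy as np

# Check (A x B) x C == (A.C) B - (B.C) A on random vectors
rng = np.random.default_rng(0)
A, B, C = rng.standard_normal((3, 3))

lhs = np.cross(np.cross(A, B), C)
rhs = np.dot(A, C) * B - np.dot(B, C) * A

print(np.allclose(lhs, rhs))  # True
```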

  • 0
    $(x_1,y_1,z_1)\cdot(x_2,y_2,z_2)=(x_1x_2,y_1y_2,z_1z_2)$ – 2013-07-16

5 Answers

6

Let $a=\begin{pmatrix} a_1 \\ a_2 \\ a_3 \end{pmatrix}$ and likewise for $b,c.$ The first component of $a\times (b \times c)$ is seen, by applying the definition, to be $a_2(b_1c_2 - b_2c_1) - a_3(b_3c_1-b_1c_3).$ With some algebra, this is seen to be $(a_2c_2+a_3c_3)b_1 - (a_2b_2+a_3b_3)c_1.$ The tricky transformation to apply here is simultaneously adding and subtracting the quantity $a_1b_1c_1.$ This allows us now to write the first component as $(a_1c_1 + a_2c_2 +a_3c_3)b_1 - (a_1b_1 + a_2b_2 + a_3b_3)c_1.$ Applying similar arguments to the second and third coordinates, we see that $a\times (b\times c) = \begin{pmatrix}(a_1c_1 + a_2c_2 +a_3c_3)b_1 - (a_1b_1 + a_2b_2 + a_3b_3)c_1 \\ (a_1c_1 + a_2c_2 +a_3c_3)b_2 - (a_1b_1 + a_2b_2 + a_3b_3)c_2 \\ (a_1c_1 + a_2c_2 +a_3c_3)b_3 - (a_1b_1 + a_2b_2 + a_3b_3)c_3 \end{pmatrix} = (a\cdot c)b - (a\cdot b)c.$
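
If you want to double-check that regrouping symbolically, here is a minimal sympy sketch (assuming sympy is available; the symbol names are only for illustration) that compares the first component with $(a\cdot c)b_1 - (a\cdot b)c_1$:

```python
import sympy as sp

a1, a2, a3, b1, b2, b3, c1, c2, c3 = sp.symbols('a1:4 b1:4 c1:4')
a = sp.Matrix([a1, a2, a3])
b = sp.Matrix([b1, b2, b3])
c = sp.Matrix([c1, c2, c3])

# First component of a x (b x c), straight from the definition
first = a2*(b1*c2 - b2*c1) - a3*(b3*c1 - b1*c3)

# After adding and subtracting a1*b1*c1 and regrouping: (a.c) b1 - (a.b) c1
regrouped = a.dot(c)*b1 - a.dot(b)*c1

print(sp.simplify(first - regrouped) == 0)  # True
```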

  • 0
    Ah yes, I got stuck on the 'tricky transformation' part. I knew there was some hole missing, since the final answer I derived looks almost the same as the correct equation that is given. It makes sense now, thanks for your help. – 2012-07-07
3

The vector $\vec A\times \vec B$ is perpendicular to the plane containing $\vec A$ and $\vec B$. Now, $(\vec A\times \vec B)\times \vec C$ is perpendicular to the plane containing the vectors $\vec C$ and $\vec A\times \vec B$; thus $(\vec A\times \vec B)\times \vec C$ lies in the plane containing $\vec A$ and $\vec B$, and hence is a linear combination of the vectors $\vec A$ and $\vec B$: $(\vec A\times \vec B)\times \vec C=\alpha \vec A + \beta \vec B$. Taking the dot product of both sides with $\vec C$ gives $0$ on the L.H.S. (as $(\vec A\times \vec B)\times \vec C$ is perpendicular to $\vec C$), hence $0=\alpha (\vec A\cdot \vec C)+\beta(\vec B\cdot\vec C)\implies \frac{\beta}{\vec A\cdot \vec C}=\frac{-\alpha}{\vec B\cdot \vec C}=\lambda \implies \alpha=-\lambda(\vec B\cdot \vec C)$ and $\beta=\lambda(\vec A\cdot \vec C) \implies (\vec A\times \vec B)\times \vec C=\lambda((\vec A\cdot \vec C)\vec B-(\vec B\cdot \vec C)\vec A)$. Here, $\lambda$ is independent of the magnitudes of the vectors: if the magnitude of any vector is multiplied by a scalar, that scalar appears explicitly on both sides of the equation. Thus, putting the unit vectors $\vec A=\vec i$, $\vec B=\vec j$, $\vec C=\vec i$ into the equation gives $\vec j=\lambda\vec j\implies \lambda=1$, and hence $(\vec A\times \vec B)\times \vec C=(\vec A\cdot \vec C)\vec B-(\vec B\cdot \vec C)\vec A$.
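
As a numerical illustration (a minimal sketch assuming numpy; the random vectors are not part of the argument), one can solve $(\vec A\times \vec B)\times \vec C=\alpha \vec A+\beta \vec B$ for $\alpha,\beta$ and check that $\lambda=1$, i.e. $\alpha=-(\vec B\cdot\vec C)$ and $\beta=\vec A\cdot\vec C$:

```python
import numpy as np

rng = np.random.default_rng(1)
A, B, C = rng.standard_normal((3, 3))

lhs = np.cross(np.cross(A, B), C)

# Solve lhs = alpha*A + beta*B in the least-squares sense (exact here,
# since lhs is orthogonal to A x B and hence lies in span{A, B})
coeffs, *_ = np.linalg.lstsq(np.column_stack([A, B]), lhs, rcond=None)
alpha, beta = coeffs

print(np.isclose(alpha, -np.dot(B, C)))  # True, i.e. alpha = -(B.C)
print(np.isclose(beta,  np.dot(A, C)))   # True, i.e. beta  =  (A.C)
```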

  • 0
    Why is $\lambda$ independent of the vectors $A,B,C$? – 2013-07-16
2

$\def\A{{\bf A}} \def\B{{\bf B}} \def\C{{\bf C}} \def\x{\times} \def\o{\cdot} \def\d{\delta} \def\e{\epsilon}$I have always found such products easiest to discover using the properties of the Kronecker delta and the Levi-Civita symbol.

Note that $\A\o\B = A_i B_j \d_{ij}$ and $(\A\x\B)_k = A_i B_j \e_{ijk}$, where we use the Einstein summation convention. Also, $\e_{ijk}\e_{lmk} = \e_{ijk}\e_{klm} = \d_{il}\d_{jm} - \d_{im}\d_{jl}$. Then \begin{eqnarray*} ((\A\x\B)\x\C)_m &=& A_i B_j \e_{ijk} C_l \e_{klm} \\ &=& A_i B_j C_l(\d_{il}\d_{jm} - \d_{im}\d_{jl}) \\ &=& (\A\o\C)B_m - (\B\o\C)A_m. \end{eqnarray*} Therefore, $(\A\x\B)\x\C = (\A\o\C)\B - (\B\o\C)\A,$ as claimed.
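
If it helps to see the index bookkeeping carried out concretely, here is a minimal Python sketch (the eps helper is just one way to implement the Levi-Civita symbol, with indices shifted to start at $0$) that performs the sum $A_i B_j \epsilon_{ijk} C_l \epsilon_{klm}$ directly and compares it with $({\bf A}\cdot{\bf C})B_m-({\bf B}\cdot{\bf C})A_m$:

```python
import itertools
import numpy as np

def eps(i, j, k):
    # Levi-Civita symbol for indices in {0, 1, 2}
    return (i - j) * (j - k) * (k - i) // 2

rng = np.random.default_rng(2)
A, B, C = rng.standard_normal((3, 3))

# ((A x B) x C)_m = A_i B_j eps_{ijk} C_l eps_{klm}, summed over i, j, k, l
lhs = np.array([
    sum(A[i] * B[j] * eps(i, j, k) * C[l] * eps(k, l, m)
        for i, j, k, l in itertools.product(range(3), repeat=4))
    for m in range(3)
])
rhs = np.dot(A, C) * B - np.dot(B, C) * A

print(np.allclose(lhs, rhs))  # True
```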

Addendum: In proving such identities, there is always the question of what formalism you have to work with. My recommendation is to work with the Levi-Civita symbol, as it is more fundamental than the cross product. The result $\e_{ijk}\e_{lmk} = \d_{il}\d_{jm} - \d_{im}\d_{jl}$ is a consequence of the general properties of $\e$, but I give here a simple proof.

The product $\e_{ijk}\e_{lmk}$ is antisymmetric in $ij$ and $lm$. Thus, $\e_{ijk}\e_{lmk} = c \left(\d_{il}\d_{jm} - \d_{im}\d_{jl}\right)$. But $\e_{123} = 1$, so taking $i=l=1$ and $j=m=2$ gives $1 = c\left(\d_{11}\d_{22} - \d_{12}\d_{21}\right) = c$. Therefore, $c=1$.
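
The contraction identity can also be verified by brute force over all index values; a short self-contained sketch (again with $0$-based indices, purely as an illustration):

```python
import itertools

def eps(i, j, k):
    # Levi-Civita symbol for indices in {0, 1, 2}
    return (i - j) * (j - k) * (k - i) // 2

def delta(i, j):
    return 1 if i == j else 0

# Check eps_{ijk} eps_{lmk} = delta_{il} delta_{jm} - delta_{im} delta_{jl}
ok = all(
    sum(eps(i, j, k) * eps(l, m, k) for k in range(3))
    == delta(i, l)*delta(j, m) - delta(i, m)*delta(j, l)
    for i, j, l, m in itertools.product(range(3), repeat=4)
)
print(ok)  # True
```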

  • 0
    @RahulNarain: You can think through all the combinations of $(1, 2, 3)$ manually. Say $k=1$. Now there is only the $i = l$ and $j = m$ case, or the other way around. That will point you in the right direction. – 2013-01-27
2

Before proving this result, I prove a more general result:

Let $f(x,y,z)$ and $g(x,y,z)$ be two $3$-linear maps on $\mathbb{R}^3$ with standard ordered basis $\{e_1,e_2,e_3\}$. If $f(e_i,e_j,e_k)=g(e_i,e_j,e_k)$ for all $i,j,k \in \{1,2,3\}$, then $f(x,y,z)=g(x,y,z)$ for all $(x,y,z) \in (\mathbb{R}^3)^3$.

Proof: For each $(x,y,z) \in (\mathbb{R}^3)^3$, let $x=\sum x_ie_i$, $y=\sum y_je_j$ and $z=\sum z_ke_k$. Then $f(x,y,z)=\sum_i \sum_j \sum_k x_iy_jz_kf(e_i,e_j,e_k)=\sum_i \sum_j \sum_k x_iy_jz_kg(e_i,e_j,e_k)=g(x,y,z)$.

Now, let $f(x,y,z)=(x\times y)\times z$ and $g(x,y,z)=(x\cdot z)y-(y\cdot z)x$. It is not difficult to see that $f(e_i,e_j,e_k)=g(e_i,e_j,e_k)$ for all $i,j,k$. As a result, $f(x,y,z)=g(x,y,z)$ on $(\mathbb{R}^3)^3$ by the lemma. Hence, $(x\times y)\times z=(x\cdot z)y-(y\cdot z)x$.
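
The basis-vector check in the last step is easy to automate; here is a minimal sketch (using numpy, purely as an assumed tool) that runs over all $27$ triples $(e_i,e_j,e_k)$:

```python
import itertools
import numpy as np

e = np.eye(3)  # standard basis e_1, e_2, e_3 as rows

def f(x, y, z):
    return np.cross(np.cross(x, y), z)       # (x × y) × z

def g(x, y, z):
    return np.dot(x, z)*y - np.dot(y, z)*x   # (x·z) y - (y·z) x

print(all(np.allclose(f(e[i], e[j], e[k]), g(e[i], e[j], e[k]))
          for i, j, k in itertools.product(range(3), repeat=3)))  # True
```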

For your reference: Multilinear map

2

Writing the vectors $\boldsymbol{U},\boldsymbol{V},\boldsymbol{W}$ in terms of the unit vectors,

$\begin{pmatrix} \boldsymbol{U}\\ \boldsymbol{V}\\ \boldsymbol{W} \end{pmatrix}=\begin{pmatrix} u_1\boldsymbol{i}+u_2\boldsymbol{j}+u_3\boldsymbol{k}\\ v_1\boldsymbol{i}+v_2\boldsymbol{j}+v_3\boldsymbol{k}\\ w_1\boldsymbol{i}+w_2\boldsymbol{j}+w_3\boldsymbol{k} \end{pmatrix}$

$\boldsymbol{U\times(V\times W)}=(u_1\boldsymbol{i}+u_2\boldsymbol{j}+u_3\boldsymbol{k})\times\begin{vmatrix} \boldsymbol{i}&\boldsymbol{j}&\boldsymbol{k}\\ v_1&v_2&v_3\\ w_1&w_2&w_3 \end{vmatrix}\\ =\begin{vmatrix} \boldsymbol{i}&\boldsymbol{j}&\boldsymbol{k}\\ u_1&u_2&u_3\\ \begin{vmatrix} v_2&v_3\\ w_2&w_3 \end{vmatrix}&\begin{vmatrix} v_3&v_1\\ w_3&w_1 \end{vmatrix}&\begin{vmatrix} v_1&v_2\\ w_1&w_2 \end{vmatrix} \end{vmatrix}\\ =(u_2(v_1w_2-v_2w_1)-u_3(v_3w_1-v_1w_3))\boldsymbol{i}+(u_3(v_2w_3-v_3w_2)-u_1(v_1w_2-v_2w_1))\boldsymbol{j}+(u_1(v_3w_1-v_1w_3)-u_2(v_2w_3-v_3w_2))\boldsymbol{k}\\ =(\color{GoldenRod}{v_1}(\color{GoldenRod}{u_1w_1}+u_2w_2+u_3w_3)-\color{GoldenRod}{w_1}(\color{GoldenRod}{u_1v_1}+u_2v_2+u_3v_3))\boldsymbol{i}+(\color{GoldenRod}{v_2}(u_1w_1+\color{GoldenRod}{u_2w_2}+u_3w_3)-\color{GoldenRod}{w_2}(u_1v_1+\color{GoldenRod}{u_2v_2}+u_3v_3))\boldsymbol{j}+(\color{GoldenRod}{v_3}(u_1w_1+u_2w_2+\color{GoldenRod}{u_3w_3})-\color{GoldenRod}{w_3}(u_1v_1+u_2v_2+\color{GoldenRod}{u_3v_3}))\boldsymbol{k}\\ =(u_1w_1+u_2w_2+u_3w_3)(v_1\boldsymbol{i}+v_2\boldsymbol{j}+v_3\boldsymbol{k})-(u_1v_1+u_2v_2+u_3v_3)(w_1\boldsymbol{i}+w_2\boldsymbol{j}+w_3\boldsymbol{k})\\ =(\boldsymbol{W}\cdot\boldsymbol{U})\boldsymbol{V}-(\boldsymbol{U}\cdot\boldsymbol{V})\boldsymbol{W}$

Since the cross product is anticommutative and the dot product is commutative, $\boldsymbol{(V\times W)\times U}=(\boldsymbol{V}\cdot\boldsymbol{U})\boldsymbol{W}-(\boldsymbol{W}\cdot\boldsymbol{U})\boldsymbol{V}$. Substituting $(\boldsymbol{V},\boldsymbol{W},\boldsymbol{U})=(\boldsymbol{A},\boldsymbol{B},\boldsymbol{C})$,

$\boldsymbol{(A\times B)\times C}=(\boldsymbol{A}\cdot\boldsymbol{C})\boldsymbol{B}-(\boldsymbol{B}\cdot\boldsymbol{C})\boldsymbol{A}$

Cycling the vectors as $\boldsymbol{A}\to\boldsymbol{B}\to\boldsymbol{C}\to\boldsymbol{A}$ and summing the three resulting triple products gives an interesting identity (the Jacobi identity): $\boldsymbol{A\!\times\!(B\!\times\!C)+B\!\times\!(C\!\times\!A)+C\!\times\!(A\!\times\!B)}=\boldsymbol{0}$
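
Both the identity above and this cyclic sum are easy to confirm numerically; here is a minimal numpy sketch (the random vectors are only an illustration, not part of the derivation):

```python
import numpy as np

rng = np.random.default_rng(3)
A, B, C = rng.standard_normal((3, 3))

# (A x B) x C = (A.C) B - (B.C) A
print(np.allclose(np.cross(np.cross(A, B), C),
                  np.dot(A, C)*B - np.dot(B, C)*A))  # True

# Jacobi identity: A x (B x C) + B x (C x A) + C x (A x B) = 0
jacobi = (np.cross(A, np.cross(B, C))
          + np.cross(B, np.cross(C, A))
          + np.cross(C, np.cross(A, B)))
print(np.allclose(jacobi, 0))  # True
```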