9

There is a well-known isomorphism between the Lie algebra $\mathfrak{so}(3)$ and $\mathbb{R}^3$ which maps the Lie bracket to the vector cross product. It looks like

$$ \begin{pmatrix} 0 & -z &y\\ z & 0 & -x\\ -y & x & 0 \end{pmatrix} \mapsto \begin{pmatrix}x\\y\\z\end{pmatrix}. $$
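
(As a quick sanity check of this statement, not part of the construction below: in the canonical basis with the canonical inner product one can verify numerically that the commutator goes over to the cross product. The helper names here are just mine.)

```python
import numpy as np

def hat(v):
    """Map a vector (x, y, z) to the corresponding element of so(3)."""
    x, y, z = v
    return np.array([[0, -z,  y],
                     [z,  0, -x],
                     [-y, x,  0]])

def unhat(X):
    """Inverse map: read (x, y, z) off a skew-symmetric 3x3 matrix."""
    return np.array([X[2, 1], X[0, 2], X[1, 0]])

rng = np.random.default_rng(0)
u, v = rng.standard_normal(3), rng.standard_normal(3)

# The commutator in so(3) corresponds to the cross product in R^3.
bracket = hat(u) @ hat(v) - hat(v) @ hat(u)
assert np.allclose(unhat(bracket), np.cross(u, v))
```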

There is a more geometric description to this isomorphism, which in broad strokes involves identifying $\mathbb{R}^3$ with $\Lambda^2(\mathbb{R}^3)$ via the Hodge star, and identifying $\mathfrak{so}(3)\subseteq \operatorname{End}(\mathbb{R}^3)\cong\mathbb{R}^3\otimes(\mathbb{R}^3)^*$ with $\Lambda^2(\mathbb{R}^3)$.

If you work in $\mathbb{R}^3$ with its canonical basis, canonical inner product, and canonical orientation, it's easy to see that these isomorphisms yield the specified result. Now I'm trying to check the details of these identifications for a general vector space (with $\dim V=3$ injected into the argument when necessary), giving all the isomorphisms explicitly and without choosing a basis. I found this excellent answer by Qiaochu Yuan to be helpful for some of the steps.

Let me walk through the steps, as I see them:

  1. First let's set notation. Let $V$ be an arbitrary real vector space with inner product $g$ and volume form $\Omega$. The special orthogonal group $SO(V)=\{O\in \operatorname{Aut}(V)\mid g(Ov,Ow)=g(v,w),\ \det O=1\}$ has Lie algebra $\mathfrak{so}(V)=\{X\in \operatorname{End}(V)\mid g(Xv,w)+g(v,Xw)=0\}$.
  2. Therefore the bilinear form $\tilde{\alpha}(X)=\operatorname{eval}\circ((\flat\circ X)\otimes\operatorname{id})\colon v\otimes w\mapsto g(Xv,w)$ is skew-symmetric. Here $\operatorname{eval}$ is the evaluation map $V^*\otimes V\to\mathbb{R}$ which takes $\sigma\otimes w\mapsto \sigma(w)$, and $\flat$ is the canonical isomorphism $V\to V^*$ induced by the non-degenerate bilinear form $g$, given by $u\mapsto (v\mapsto g(u,v))$.
  3. Since $\tilde{\alpha}(X)\colon V\otimes V\to \mathbb{R}$ is skew-symmetric, it descends to a map $\alpha(X)\colon\Lambda^2(V)\to\mathbb{R}$, i.e. $\alpha(X)\in\Lambda^2V^*$. Now applying the inverse map $\sharp\colon V^*\to V$ to the first tensor factor, we get $\beta(X)=(\sharp\wedge\operatorname{id})(\alpha(X))\in V\wedge V^*$. To make this construction a little more concrete, observe that given a one-parameter group of rotations in the plane spanned by two vectors $v,w$, one can check that the corresponding Lie algebra element satisfies $\beta(X)=w\wedge v^\flat$.
  4. Apply $\sharp$ to the second tensor factor to get an element of the exterior algebra of $V$ itself, $\gamma(X)=(\operatorname{id}\wedge\sharp)(\beta(X))\in\Lambda^2V$.
  5. Apply the Hodge star operator to get an element of the vector space $\delta(X)=*\gamma(X)\in V.$
  6. Check that $\delta$ is an isomorphism of Lie algebras, i.e. that $\delta([X,Y])=\delta(X)\times\delta(Y)=*(\delta(X)\wedge\delta(Y))$. (Clearly it's linear; I suppose it should also be verified that $\delta$ is bijective. A numerical sanity check of steps 2-6 in coordinates is sketched just after this list.)
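
Here is the sanity check just mentioned. It is emphatically not basis-free: it picks a random symmetric positive definite matrix for $g$, takes $\Omega=\sqrt{\det g}\,\varepsilon$ in the standard orientation, collapses steps 3-4 into a single raising of both indices, and then tests step 6 on random elements of $\mathfrak{so}(V)$. The helper names are mine.

```python
import numpy as np

rng = np.random.default_rng(1)

# A "generic" inner product g on R^3 (random symmetric positive definite matrix)
# and the associated volume form Omega = sqrt(det g) * (Levi-Civita symbol),
# with the standard orientation.
M = rng.standard_normal((3, 3))
g = M @ M.T + 3 * np.eye(3)
g_inv = np.linalg.inv(g)

eps = np.zeros((3, 3, 3))                  # Levi-Civita symbol
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k], eps[j, i, k] = 1, -1
Omega = np.sqrt(np.linalg.det(g)) * eps

def random_so(rng):
    """A random X with g(Xv, w) + g(v, Xw) = 0."""
    S = rng.standard_normal((3, 3))
    S = S - S.T                            # a random 2-form S_{ab}
    return g_inv @ S                       # raise an index: X^a_b = g^{ac} S_{cb}

def delta(X):
    """Steps 2-5: X -> alpha(X) -> gamma(X) -> *gamma(X) in V."""
    alpha = X.T @ g                        # alpha(v, w) = g(Xv, w)
    gamma = g_inv @ alpha @ g_inv          # raise both indices
    cov = 0.5 * np.einsum('abc,bc->a', Omega, gamma)   # Hodge star, as a covector
    return g_inv @ cov                     # raise the remaining index

def cross(x, y):
    """The cross product determined by g and Omega, i.e. *(x wedge y)."""
    return g_inv @ np.einsum('abc,b,c->a', Omega, x, y)

X, Y = random_so(rng), random_so(rng)
assert np.allclose(delta(X @ Y - Y @ X), cross(delta(X), delta(Y)))
```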

So I have several problems.

  1. Step 2 above seems very convoluted and inelegant, with all the raising and lowering operators just to check whether something is an antisymmetric map.
  2. Step 2 is not just convoluted, but of dubious construction. Usually one considers the exterior power of a vector space with itself, not a vector space wedged with its dual space. That construction seems nonstandard to me, so I'm in unfamiliar territory and wonder whether this is the wrong path.
  3. I can't really see how to do step 6. I can't push the Lie bracket through all these opaque constructions.
  4. The $\operatorname{eval}$ map used in step 2 is also somewhat mysterious, at least in one direction. That is, there's really only a canonical injection $\operatorname{eval}\colon V\otimes V^*\to \operatorname{End}(V)$ given by $v\otimes\sigma\mapsto (w\mapsto \sigma(w)v)$. In the case that $V$ is not finite-dimensional, this will not be an isomorphism and does not have an inverse. In the finite-dimensional case we do have an inverse: any endomorphism can be written as a sum of tensor products of vectors and dual vectors, but not canonically. Mapping $\mathfrak{so}(V)$ to $\Lambda^2(V)$ seems to require as an intermediate step mapping $\operatorname{End}(V)$ to $V\otimes V^*$. Can this be done canonically?
  5. Given the concerns about the direction of the map $V\otimes V^*\to \operatorname{End}(V)$, perhaps I should try to go the other way, but I can't make any progress that way either, because I don't know any identity for the Hodge star operator that will allow me to evaluate expressions like $*(X\wedge Y)$. How does the Hodge star operator interact with the wedge product?

Those are the issues I'm running into with my approach. I had somehow expected this to be easier. I'd appreciate any resolutions to those issues; alternatively, if there is a better approach to this question, or some reason why the question itself is not a good one, I'd love to hear about it. Thanks.

2 Answers

3
  1. Convoluted and inelegant is in the eye of the beholder. Once you get used to identifying in your head everything which is related by a raising or lowering operator things aren't so bad.

  2. $V$ is an inner product space, so $V \otimes V^{\ast}$ can be canonically identified with both $V \otimes V$ and $V^{\ast} \otimes V^{\ast}$.

  3. I have some ideas for how to do this, but they're not particularly nice. One way is to use the representation theory of $\text{SO}(V)$. Because everything you've done is canonical, it's all $\text{SO}(V)$-equivariant. $V$ is an irreducible representation, so it follows by Schur's lemma that any two equivariant isomorphisms $\Lambda^2(V) \to V$ are scalar multiples of each other. In particular, the Lie bracket and the exterior product are scalar multiples of each other, and from here you can probably finish by arguing using inner products.

  4. The inverse of a canonical map, when the inverse exists, is also canonical.

  5. See above.

  • 0
    Thank you, Qiaochu. So basically, you don't say that I'm making a mistake, or overlooking a simpler route. I should keep working on it. I was a little taken aback by how much more difficult this was compared with just choosing a basis; I thought maybe I was overlooking something.2012-02-10
  • 0
    @ziggurism: well, I didn't say you weren't overlooking a simpler approach. I have some ideas, but the details seem a little messy right now, so I'll think about it.2012-02-10
  • 0
    @ziggurism: some hints. It is possible to more-or-less canonically extend an inner product on a vector space $V$ to an inner product on any exterior power $\Lambda^n(V)$ such that the exterior product of unit vectors in $V$ is a unit vector in $\Lambda^n(V)$. Now, the cross product is not only antisymmetric, but the cross product of two orthogonal unit vectors is another unit vector...2012-02-10
  • 0
    Looking over this again, I think the stuff about $V\wedge V^*$ is nonsense. Doesn't make sense to talk about wedge products of things from different vector spaces.2012-03-16
  • 0
    @ziggurism: what stuff?2012-03-16
  • 0
    I constructed a map $\tilde{\alpha}$ given by $\tilde{\alpha}(X)=\operatorname{eval}\circ((\flat\circ X)\otimes\operatorname{id})\colon v\otimes w\mapsto g(Xv,w)$. Then I said it factors to a map on $\Lambda^2 V$, then I used a raising operator on the first factor, giving me an element which I called $w\wedge v^\flat\in V\wedge V^*$. This stuff does not make sense. The way wedge products are defined does not allow you to wedge two distinct vector spaces.2012-03-16
  • 0
    And now I think my steps 2-4 are needlessly convoluted. I went that route because I noticed that $g(X\cdot,\cdot)$ was antisymmetric, and therefore by the universal property descends to a map on $\Lambda^2 V$, and with a bunch of raising and lowering gymnastics, I turned that map into an element of $\Lambda^2 V$. But it's much simpler to just say there is an isomorphism $\Lambda^2 V \cong \mathfrak{so}(V)$ given by $u\wedge v\mapsto u\otimes v^\flat-v\otimes u^\flat$.2012-03-16
  • 0
    As for your hint, I can almost see how that goes: you can check that the Hodge dual of a wedge product in three dimensions gives a new vector which is orthogonal to both factors. And so does the vector cross product, and by uniqueness properties, we can declare them equal. I guess I was hoping for something more explicit. I was hoping to take some pretty basis-free formula for the Hodge star, apply it to some vectors, take the commutator (using the isomorphism), and then another Hodge star, and the vector cross product would appear. But I can't make headway against the formula $*([*u,*v])$.2012-03-16
  • 0
    I mean, isn't your hint saying basically to work in an orthonormal basis? That's an isomorphism to $\mathbb{R}^3$. I concede that if we work in $\mathbb{R}^3$ and identify all the isomorphic duals and such, the question becomes a lot easier. I was hoping the isomorphism would work out nicely in a basis-free context, and that this computation would somehow be aesthetically pleasing. Of course, just because I hope a thing will look nice does not make it so...2012-03-16
  • 0
    I was googling around on this issue a little bit, and I came across [this closely related thread](http://mathoverflow.net/questions/33896/how-are-these-two-ways-of-thinking-about-the-cross-product-related) on MO from 2 years ago. Must be a good question!2012-03-17
1

This problem is nearly trivial in the index notation: not because passing to the index notation involves picking a basis (it needn't), but simply because it's much easier to express the relevant maps in that notation. For example, in the comments to Qiaochu's answer you define the iso $\Lambda^2V\rightarrow\mathfrak{so}(3)$ by $a\wedge b\mapsto a\otimes b^\flat-b\otimes a^\flat$, using the fact that every element of $\Lambda^2V$ is of the form $a\wedge b$ for some $a$ and $b$ (in dimension $>3$ we would have to write "a sum of elements of the form..."). But in the index notation we just let the metric be $g_{ab}$ and write $X^{ab}\mapsto X^{ac}g_{cb}$, with no need to pass to a decomposition into simple tensors.
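
As a quick numerical illustration (only a sketch, with a randomly chosen positive definite $g_{ab}$), lowering an index of an antisymmetric $X^{ab}$ in this way really does land in $\mathfrak{so}(V)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random metric g_{ab} (symmetric positive definite) and a random
# antisymmetric X^{ab}, i.e. an element of Lambda^2 V in coordinates.
M = rng.standard_normal((3, 3))
g = M @ M.T + 3 * np.eye(3)
X_up = rng.standard_normal((3, 3))
X_up = X_up - X_up.T

# The iso Lambda^2 V -> so(V):  X^{ab} -> X^{ac} g_{cb}.
X_end = np.einsum('ac,cb->ab', X_up, g)

# Skewness with respect to g:  g(Xv, w) + g(v, Xw) = 0  <=>  X^T g + g X = 0.
assert np.allclose(X_end.T @ g + g @ X_end, 0)
```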

The key fact in the proof using index notation is that $$\Omega_{abc}\Omega^{cde}=\delta^d_a\delta^e_b-\delta^e_a\delta^d_b$$ where $\Omega_{abc}$ is the volume form (usually we would write $\varepsilon_{abc}$) and $\Omega^{abc}$ is the corresponding volume form on the dual space (or equivalently just what you get when you raise all the indices of $\Omega_{abc}$ using the (inverse of the) metric).
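
This identity is easy to check numerically. (For a positive definite metric the $\sqrt{\det g}$ factors in $\Omega_{abc}$ and $\Omega^{abc}$ cancel, so it suffices to contract two copies of the Levi-Civita symbol.)

```python
import numpy as np

# Levi-Civita symbol in three dimensions.
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k], eps[j, i, k] = 1, -1

kron = np.eye(3)

# Omega_{abc} Omega^{cde} = delta^d_a delta^e_b - delta^e_a delta^d_b
lhs = np.einsum('abc,cde->abde', eps, eps)
rhs = np.einsum('ad,be->abde', kron, kron) - np.einsum('ae,bd->abde', kron, kron)
assert np.allclose(lhs, rhs)
```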


If we want to give an index-free proof we will need something equivalent to the above identity. This turns out to be a formula for the wedge product of the Hodge duals of two wedge products: $$*(a\wedge b)\wedge*(c\wedge d)=a\wedge d\langle b,c\rangle-a\wedge c\langle b,d\rangle-b\wedge d\langle a,c\rangle+b\wedge c\langle a,d\rangle$$ (Warning! This works only in dimension $3$.)

If we pick $x,y\in V$ then in $\Lambda^2V$ they map to $*x,*y$. In order to map these into $\mathfrak{so}(3)$ we need to represent them as $*x=a\wedge b$ and $*y=c\wedge d$. Then these map to $\mathfrak{so}(3)$ to give $a\otimes b^\flat-b\otimes a^\flat$ and $c\otimes d^\flat-d\otimes c^\flat$. We can then take their Lie bracket in $\operatorname{End}(V)$, getting $8$ terms, the first of which is $$(a\otimes b^\flat)\circ(c\otimes d^\flat)=a\otimes d^\flat\langle b,c\rangle.$$ If you write out these $8$ terms you'll spot that they are the image under our iso from $\Lambda^2V$ of $$a\wedge d\langle b,c\rangle-a\wedge c\langle b,d\rangle-b\wedge d\langle a,c\rangle+b\wedge c\langle a,d\rangle$$ which by our above identity is $$*(a\wedge b)\wedge*(c\wedge d)=x\wedge y$$. Mapping back into $V$ from $\Lambda^2V$ gives $*(x\wedge y)$, which is the cross-product!
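
If it helps, the regrouping of those eight terms can be checked numerically with random vectors (standard inner product, so that $\flat$ is just the transpose; this checks only the regrouping, not the Hodge identity itself):

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, c, d = rng.standard_normal((4, 3))

def iso(u, v):
    """The iso Lambda^2 V -> so(V) on a simple bivector:
    u wedge v -> u v^T - v u^T (flat is just the transpose here)."""
    return np.outer(u, v) - np.outer(v, u)

X, Y = iso(a, b), iso(c, d)
bracket = X @ Y - Y @ X

# The eight terms regroup into the image of
#   <b,c>*(a wedge d) - <b,d>*(a wedge c) - <a,c>*(b wedge d) + <a,d>*(b wedge c).
expansion = (np.dot(b, c) * iso(a, d) - np.dot(b, d) * iso(a, c)
             - np.dot(a, c) * iso(b, d) + np.dot(a, d) * iso(b, c))
assert np.allclose(bracket, expansion)
```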


So where does our identity above come from? I don't know. I just proved it using index notation. The general form is the following:

We're working in dimension $n$ and we want to know $$*(a_1\wedge\dots\wedge a_p)\wedge*(b_1\wedge\dots\wedge b_q).$$ Let $k=p+q-n$. Let $P=\{1,\dots,p\}$ and $Q=\{1,\dots,q\}$. For $K\subseteq P$ define $a_K$ to be $$a_{k_1}\wedge\dots\wedge a_{k_{|K|}}$$ where $k_1<\dots<k_{|K|}$ are the elements of $K$ in increasing order. Also, define $\Sigma K$ to be the sum of the elements of $K$. Similarly $b_L$ and $\Sigma L$ for $L\subseteq Q$.

Then $$*(a_1\wedge\dots\wedge a_p)\wedge*(b_1\wedge\dots\wedge b_q)=\sum_{|K|=k}\sum_{|L|=k}(-1)^{\Sigma K+\Sigma L}a_{P\setminus K}\wedge b_{Q\setminus L}\left\langle a_K,b_L\right\rangle$$ where the inner product is induced on $\Lambda^kV$ by the one on $V$.

I can prove this formula using index notation, but surprisingly I can't find any references for it. Perhaps I will ask a question to see if anyone recognises it and perhaps they'll be able to give an index-free proof.

  • 0
    The index-free approach can be done in Clifford algebra also, in which the result reads $(\Omega x) \times (\Omega y) = x \wedge y$, where $\times$ here denotes the "commutator product." The index notation identity you give is manifestly $(a \wedge b) \cdot (e \wedge d) = (a \cdot d)(b \cdot e) - (a \cdot e)(b \cdot d)$.2014-11-07