
Suppose one interprets quadratic polynomials (i.e., parabolas) \begin{eqnarray} F(x) &=& a x^2 + b x + c\\ G(x) &=& d x^2 + e x + f \end{eqnarray} as vectors $(a,b,c)$ and $(d,e,f)$ in $\mathbb{R}^3$. Then the formal cross-product of these vector coefficients is $$ (a,b,c) \times (d,e,f) = (-c e + b f, c d - a f, -b d + a e) \;, $$ which might be interpreted as $$ F{\times}G(x) = (-c e + b f) x^2 + (c d - a f) x + (-b d + a e) \;. $$
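As a quick sketch, the formal cross product of coefficient triples can be computed directly (the function name `poly_cross` is mine, for illustration):

```python
# Sketch: the formal cross product of quadratic-coefficient vectors,
# treating F(x) = a x^2 + b x + c as the vector (a, b, c).

def poly_cross(F, G):
    """(a, b, c) x (d, e, f) for coefficient triples."""
    a, b, c = F
    d, e, f = G
    return (b * f - c * e, c * d - a * f, a * e - b * d)

# Example: F(x) = x^2 + 2x + 3, G(x) = 4x^2 + 5x + 6
print(poly_cross((1, 2, 3), (4, 5, 6)))  # (-3, 6, -3)

# Special case: G = 2F gives the zero polynomial, as expected
print(poly_cross((1, 2, 3), (2, 4, 6)))  # (0, 0, 0)
```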

Is there some way to view the cross-product of polynomial vectors as another polynomial that is orthogonal to the originals, in a sense analogous to vectors in $\mathbb{R}^3$? I would expect that, for generic polynomials, under some inner product (perhaps with a weighting function), $$ \int F(x) \, [F{\times}G](x) \, dx = 0 \;. $$ In other words,

Is there some viewpoint from which the cross-product of two polynomials $F$ and $G$ yields a polynomial $F {\times} G$ which is orthogonal to both $F$ and $G$?

I illustrate a few special cases below, e.g., when $G(x) = s F(x)$ for some scale factor, then $F{\times}G(x)=0$. But in general I do not see a way to interpret $F{\times}G$ as "orthogonal" in some sense to $F$ and $G$.


[Figure PolyCross4: four panels — $G$ is a constant times $F$; $F$ and $G$ are linear ($a=d=0$); $F$ and $G$ have no constant term ($c=f=0$); $F$ and $G$ are centered on the $y$-axis ($b=e=0$).]


  • Why do you need an extra $x$ at the beginning of the RHS of the fourth formula? 2017-01-10
  • @JeanMarie: I removed that derivation, which I agree was unnatural. 2017-03-22

1 Answer


Yes, but I'm not sure how interesting it is. You have a linear isomorphism $T \colon \mathbb{R}_{\leq 2}[x] \rightarrow \mathbb{R}^3$ sending a polynomial $p(x) = ax^2 + bx + c$ to the coefficients vector $(a,b,c)$. On $\mathbb{R}^3$, you have the cross product operation $\times$ and so you can use $T$ to transfer it to $\mathbb{R}_{\leq 2}[x]$ by defining

$$ p \times' q := T^{-1}(T(p) \times T(q)). $$
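In the coefficient representation, $T$ and $T^{-1}$ act as the identity, so the transferred product is just the ordinary cross product of coefficient arrays. A minimal sketch (the name `cross_prime` is mine):

```python
import numpy as np

# Sketch of the construction: T maps ax^2 + bx + c to (a, b, c);
# p x' q := T^{-1}(T(p) x T(q)). Representing polynomials directly
# by coefficient arrays makes T and T^{-1} the identity, so x' is
# numpy's cross product on those arrays.

def cross_prime(p, q):
    """p x' q for quadratics given as coefficient arrays [a, b, c]."""
    return np.cross(p, q)

p = np.array([1, 2, 3])   # x^2 + 2x + 3
q = np.array([4, 5, 6])   # 4x^2 + 5x + 6
print(cross_prime(p, q))  # [-3  6 -3], i.e., -3x^2 + 6x - 3
```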

The space $\mathbb{R}^3$ also has the standard inner product $\left< \cdot, \cdot \right>$ and so you can also use $T$ to transfer this inner product to $\mathbb{R}_{\leq 2}[x]$ by defining

$$ \left< p, q \right>' := \left< T(p), T(q) \right>. $$

With this definition, $T$ becomes an isometry and so it preserves angles and lengths. In particular, since $T(p) \times T(q)$ is orthogonal to both $T(p)$ and $T(q)$ in $\mathbb{R}^3$, we'll have that $p \times' q$ is orthogonal to both $p,q$:

$$ \left< p \times' q, p \right>' = \left< T(p \times' q), T(p) \right> = \left< T(p) \times T(q), T(p) \right> = 0 $$

(and similarly for $q$). You can write down the formulas for $\times'$ and $\left< \cdot, \cdot \right>'$ explicitly. They will look like the familiar formulas from $\mathbb{R}^3$:

$$ (ax^2 + bx + c) \times' (dx^2 + ex + f) = (bf - ce)x^2 - (af - cd)x + (ae - bd), $$
$$ \left< ax^2 + bx + c, dx^2 + ex + f \right>' = ad + be + cf. $$
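A quick numerical check of the claimed orthogonality, using the explicit formulas above (coefficient triples stand in for polynomials; the helper names are mine):

```python
# Checking that p x' q is orthogonal to both p and q under the
# coefficient inner product <p, q>' = ad + be + cf.

def cross(p, q):
    a, b, c = p
    d, e, f = q
    return (b * f - c * e, c * d - a * f, a * e - b * d)

def inner(p, q):
    return sum(x * y for x, y in zip(p, q))

p = (1.0, -2.0, 3.0)   # x^2 - 2x + 3
q = (0.5, 4.0, -1.0)   # 0.5x^2 + 4x - 1
r = cross(p, q)
print(inner(r, p), inner(r, q))  # 0.0 0.0
```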

You can even describe the inner product $\left< p, q \right>'$ as an integral over some interval against an appropriate weight function (there are infinitely many possible choices for such a weight function).

  • Of course, if you use a different linear isomorphism, you'll get a different notion of cross product and a different notion of orthogonality for polynomials, such that the cross product of two polynomials is orthogonal to both of them. 2017-01-10