
Consider a real polynomial

$ f(x) := \sum_{k=0}^n c_k x^k $

of degree $n$, where $c_k \in \mathbb{R}$ are constant coefficients. Given $n+1$ samples $f_i$ of $f$ at distinct sample points $x_i$, one can recover the coefficients by solving the linear system

$ f(x_i) = f_i. $
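For instance, a quick NumPy sketch of this Vandermonde solve (the quadratic used as ground truth is made up for illustration):

```python
import numpy as np

# Recover the coefficients of a degree-2 polynomial from n+1 = 3 samples
# by solving the linear system V c = f, where V is the Vandermonde matrix.
# Hypothetical ground truth: f(x) = 2 - 3x + x^2.
x = np.array([0.0, 1.0, 2.0])        # distinct sample points x_i
f = 2.0 - 3.0 * x + x**2             # samples f_i = f(x_i)

V = np.vander(x, increasing=True)    # columns 1, x, x^2
c = np.linalg.solve(V, f)            # coefficients c_0, c_1, c_2
```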

Easy enough. However, the problem seems to get harder when $f$ has a special form. For instance, suppose that

$f(x) := (ax-b)^2$

where $a,b \in \mathbb{R}$ are now the unknown coefficients. This function describes a parabola whose minimum is zero -- in other words, it "sits" on the $x$-axis. Since this polynomial has fewer degrees of freedom, one might expect to recover its coefficients from fewer samples -- in particular, two should suffice (one to recover the center, the other to recover the width). Surprisingly, though, actually computing these coefficients seems harder: we now have to solve a pair of simultaneous quadratic equations

$ (ax_1-b)^2 = f_1 $

$ (ax_2-b)^2 = f_2 $

In reality, I'm interested in the case where the domain is $\mathbb{R}^2$, not $\mathbb{R}$. In this case, I want to recover the three parameters $a$, $u$, and $v$ of the isotropic paraboloid

$ g(x,y) := (ax-u)^2 + (ay-v)^2 $

from three samples $g_i := g(x_i,y_i)$ at distinct points $(x_i,y_i)$. Again, it seems that three samples should suffice: two for the center and one for the "width." But so far I don't see any good way of computing a solution short of applying Newton's method (and keeping my fingers crossed). Even in the 1D case, asking Mathematica for a Gröbner basis results in a hairy mess!
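For concreteness, the Newton iteration I have in mind looks something like the sketch below (the ground truth, sample points, and initial guess are all hypothetical; convergence depends on starting close enough, hence the crossed fingers):

```python
import numpy as np

# Newton's method for the 3x3 system (a x_i - u)^2 + (a y_i - v)^2 = g_i.
# Convergence is only local: a bad initial guess may diverge or land on
# another root (e.g. the sign-flipped solution (-a, -u, -v)).
def newton(pts, gs, a, u, v, iters=50):
    for _ in range(iters):
        # residuals r_i and Jacobian rows (dr_i/da, dr_i/du, dr_i/dv)
        r = np.array([(a*x - u)**2 + (a*y - v)**2 - g
                      for (x, y), g in zip(pts, gs)])
        J = np.array([[2*(a*x - u)*x + 2*(a*y - v)*y,
                       -2*(a*x - u),
                       -2*(a*y - v)] for (x, y) in pts])
        a, u, v = np.array([a, u, v]) - np.linalg.solve(J, r)
    return a, u, v

# hypothetical ground truth (a, u, v) = (2, 1, -1)
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
gs = [(2.0*x - 1.0)**2 + (2.0*y + 1.0)**2 for x, y in pts]
sol = newton(pts, gs, 1.9, 0.9, -0.9)   # initial guess near the truth
```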

Thanks. :-)

  • Ok, but still there are always a small number of solutions. It's surprising to me that these solutions are so hard to characterize. (2011-11-20)

1 Answer


In my opinion, the problem lies in the fact that your equations are not linear but polynomial, so the number of samples it takes to solve such a problem is not necessarily the number of unknowns. Indeed, a polynomial equation of degree $n$ in one variable has in general $n$ solutions. Even worse, polynomial equations in several variables may have infinitely many solutions. For example, take a look at your first problem:

$(ax - b)^2 = f$

gives $ax - b = \sqrt{f}$ or $ax - b = -\sqrt{f}$. So from just two points you get, in general, 4 different possibilities:

$a = \frac{\epsilon_1 \sqrt{f_1} - \epsilon_2 \sqrt{f_2}}{x_1-x_2}, \quad \quad b = \frac{x_2\epsilon_1 \sqrt{f_1} - x_1 \epsilon_2 \sqrt{f_2}}{x_1-x_2}$

with $\epsilon_1^2 = \epsilon_2^2 = 1$. However, if you evaluate at three distinct points, the polynomial is uniquely determined, so $(a,b)$ is unique up to a simultaneous sign change (and we cannot do better than this, since $(-a,-b)$ gives the same polynomial).
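A small numerical check of the four candidates (the sample data below is hypothetical; the formulas assume $f_1, f_2 \ge 0$ and $x_1 \ne x_2$):

```python
import numpy as np

# The four sign choices (e1, e2) and the resulting candidate pairs (a, b),
# assuming f1, f2 >= 0 and x1 != x2.
def candidates(x1, x2, f1, f2):
    out = []
    for e1 in (+1, -1):
        for e2 in (+1, -1):
            a = (e1*np.sqrt(f1) - e2*np.sqrt(f2)) / (x1 - x2)
            b = (x2*e1*np.sqrt(f1) - x1*e2*np.sqrt(f2)) / (x1 - x2)
            out.append((a, b))
    return out

# e.g. samples of f(x) = (2x - 3)^2 at x = 0 and x = 1:
for a, b in candidates(0.0, 1.0, 9.0, 1.0):
    # every candidate reproduces both samples
    assert np.isclose((a*0.0 - b)**2, 9.0) and np.isclose((a*1.0 - b)**2, 1.0)
```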

A similar problem appears for your second equation (written here with a second coefficient $b$; your isotropic case is $b = a$), because

$(ax-u)^2 + (by-v)^2 = g$

has an infinite number of real solutions of the form $ax-u = \cos(\theta)\sqrt{g}$, $by - v = \sin(\theta)\sqrt{g}$ with $\theta \in \mathbb{R}$ arbitrary.

My suggestion, if you want to solve this problem, is to first expand your polynomial:

$g(x,y) = a^2 x^2 - 2 a u x + b^2 y^2 - 2 b v y + (u^2 + v^2) = A x^2 + B x + C y^2 + D y + E$

Compute $A, B, C, D, E$ by evaluating at five suitably chosen points (for example, points of the form $(x_i, x_i^3)$ with distinct positive $x_i$ will yield a linear system with a unique solution). Then it is easy to get $a, b, u, v$ from $A, B, C, D, E$ (once again, $(a,u)$ and $(b,v)$ are unique only up to sign change).
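As a sanity check, here is a NumPy sketch of this recipe (the ground-truth values and sample points are chosen arbitrarily for illustration):

```python
import numpy as np

# Hypothetical ground truth (a, u, b, v) = (2, 1, 3, -1).
a, u, b, v = 2.0, 1.0, 3.0, -1.0
g = lambda x, y: (a*x - u)**2 + (b*y - v)**2

# five points of the form (x_i, x_i^3), as suggested above
xs = np.array([1.0, 1.5, 2.0, 2.5, 3.0])
ys = xs**3

# linear system in the substituted coefficients A, B, C, D, E
M = np.column_stack([xs**2, xs, ys**2, ys, np.ones_like(xs)])
A, B, C, D, E = np.linalg.solve(M, g(xs, ys))

# undo the substitution: A = a^2, B = -2au, C = b^2, D = -2bv
a2 = np.sqrt(A); u2 = -B / (2*a2)
b2 = np.sqrt(C); v2 = -D / (2*b2)
```

Picking the positive square roots above selects one of the two sign-related solutions; $(-a2, -u2)$ and $(-b2, -v2)$ work equally well.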