
I'm currently facing the problem of fitting a parabolic 2-surface to a set of points in $\mathbb{R}^3$. As far as implementation is concerned, I could easily do this with one of the many available gradient-based optimizers. However, I'm wondering whether the problem can be expressed more naturally as a Quadratic Program.

Let $\vec{p}_n \in \mathbb{R}^3$ be the points to be fit, and let $\left(\mathbf{q},\vec{t},a,b\right) \in \mathbb{H}\times\mathbb{R}^3\times\mathbb{R}\times\mathbb{R}$ be the parameters describing a parabolic surface ($\mathbf{q}$ being a quaternion describing its rotation, $\vec{t}$ its translation in $\mathbb{R}^3$, and $a,b$ quadratic scaling parameters). The model function can then be written as follows:

$r_n = P( \mathbf{M}^{-1} \vec{p}_n )$ (with $\vec{p}_n$ taken in homogeneous coordinates), where $\mathbf{M}=\begin{pmatrix}\mathbf{R}(\mathbf{q}) & \vec{t} \\ 0 & 1\end{pmatrix}$,

$\mathbf{R}$ being the mapping from quaternions into their equivalent $\mathbb{R}^{3\times3}$ rotation matrix and

$P(\vec{x})=\left\lVert\begin{pmatrix}x_0 & x_1\end{pmatrix}\begin{pmatrix}a \\ b\end{pmatrix}\right\rVert^2 - x_2$

Feeding this into a nonlinear least-squares solver such as Levenberg–Marquardt will approximate a solution for $\left(\mathbf{q},\vec{t},a,b\right)$.
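For concreteness, here is a minimal sketch of that nonlinear fit using SciPy's `least_squares` (the parameter values, data, and starting point are synthetic, chosen purely for illustration; `P` is implemented exactly as written above, i.e. $(a x_0 + b x_1)^2 - x_2$):

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def residuals(theta, pts):
    """r_n = P(M^{-1} p_n) with P(x) = (a*x0 + b*x1)^2 - x2, as in the question."""
    q, t, a, b = theta[:4], theta[4:7], theta[7], theta[8]
    R = Rotation.from_quat(q).as_matrix()   # from_quat normalizes q
    x = (pts - t) @ R                       # rows are R^T (p_n - t), i.e. M^{-1} p_n
    return (a * x[:, 0] + b * x[:, 1]) ** 2 - x[:, 2]

# Synthetic points lying exactly on a known surface (for illustration only).
rng = np.random.default_rng(0)
q_true = np.array([0.1, 0.2, 0.3, 0.9])
t_true = np.array([1.0, -2.0, 0.5])
a_true, b_true = 0.5, 2.0
xy = rng.normal(size=(50, 2))
local = np.column_stack([xy, (a_true * xy[:, 0] + b_true * xy[:, 1]) ** 2])
pts = local @ Rotation.from_quat(q_true).as_matrix().T + t_true

# Fit from a slightly perturbed starting point.
theta0 = np.concatenate([q_true, t_true, [a_true, b_true]]) + 0.05
fit = least_squares(residuals, theta0, args=(pts,), method="lm")
```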

I now wonder how to reformulate this into a Quadratic Programming problem.

  • Comment: Have you read pages such as https://en.m.wikipedia.org/wiki/Quadratic_programming ? (2017-02-03)

1 Answer


Let $f : \mathbb R^2 \to \mathbb R$ be defined by

$$f (\mathrm x) := \mathrm x^{\top} \mathrm Q \, \mathrm x + \mathrm r^{\top} \mathrm x + s$$

where parameters $\mathrm Q, \mathrm r, s$ are to be estimated. Half-vectorizing the quadratic form, we obtain

$$\mathrm x^{\top} \mathrm Q \, \mathrm x = (\mathrm x \otimes \mathrm x)^{\top} \mbox{vec} (\mathrm Q) = (\mathrm x \otimes \mathrm x)^{\top} \mathrm D_2 \,\mbox{vech} (\mathrm Q)$$

where $\mathrm D_2$ is the $4 \times 3$ duplication matrix. Hence, $f$ can be written as follows
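A quick numerical check of this identity (the concrete $\mathrm D_2$, $\mathrm Q$, and $\mathrm x$ below are my own illustrative choices, using column-major `vec` and $\mbox{vech}(\mathrm Q) = (q_{11}, q_{21}, q_{22})^\top$):

```python
import numpy as np

# Duplication matrix D_2: vec(Q) = D_2 vech(Q) for a symmetric 2x2 Q,
# with column-major vec(Q) = (q11, q21, q12, q22) and vech(Q) = (q11, q21, q22).
D2 = np.array([[1, 0, 0],
               [0, 1, 0],
               [0, 1, 0],
               [0, 0, 1]], dtype=float)

Q = np.array([[2.0, 0.5],
              [0.5, 3.0]])          # symmetric
vechQ = np.array([Q[0, 0], Q[1, 0], Q[1, 1]])
x = np.array([1.5, -2.0])

lhs = x @ Q @ x                     # x^T Q x
rhs = np.kron(x, x) @ D2 @ vechQ    # (x ⊗ x)^T D_2 vech(Q)
```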

$$f (\mathrm x) = \begin{bmatrix} (\mathrm x \otimes \mathrm x)^{\top} \mathrm D_2 & \mathrm x^{\top} & 1\end{bmatrix} \begin{bmatrix}\mbox{vech} (\mathrm Q)\\ \mathrm r\\ s\end{bmatrix}$$

Given $n$ points $\{ (\mathrm x^{(1)}, y^{(1)}), (\mathrm x^{(2)}, y^{(2)}), \dots, (\mathrm x^{(n)}, y^{(n)}) \}$, we estimate the parameters using the following linear model

$$\underbrace{\begin{bmatrix} (\mathrm x^{(1)} \otimes \mathrm x^{(1)})^{\top} \mathrm D_2 & (\mathrm x^{(1)})^{\top} & 1\\ (\mathrm x^{(2)} \otimes \mathrm x^{(2)})^{\top} \mathrm D_2 & (\mathrm x^{(2)})^{\top} & 1\\ \vdots & \vdots & \vdots\\ (\mathrm x^{(n)} \otimes \mathrm x^{(n)})^{\top} \mathrm D_2 & (\mathrm x^{(n)})^{\top} & 1\end{bmatrix}}_{=: \mathrm X} \,\,\, \underbrace{\begin{bmatrix} \mbox{vech} (\mathrm Q)\\ \mathrm r\\ s\end{bmatrix}}_{=: \beta} = \underbrace{\begin{bmatrix} y^{(1)}\\ y^{(2)}\\ \vdots\\ y^{(n)}\end{bmatrix}}_{=: \mathrm y}$$

and solve the following quadratic program (QP) to find the least-squares estimate

$$\text{minimize} \quad \| \mathrm X \beta - \mathrm y \|_2^2$$

If $\mathrm X$ has full column rank, then the least-squares estimate is

$$\hat{\beta} := \left( \mathrm X^{\top} \mathrm X \right)^{-1} \mathrm X^{\top} \mathrm y$$
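The whole recipe can be sketched numerically as follows (function and variable names are my own; the data is synthetic and noise-free, so the least-squares solve recovers the parameters exactly):

```python
import numpy as np

def design_matrix(pts):
    """Stack rows [ (x ⊗ x)^T D_2 , x^T , 1 ] for 2-D points x."""
    D2 = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [0, 1, 0],
                   [0, 0, 1]], dtype=float)
    return np.array([np.concatenate([np.kron(x, x) @ D2, x, [1.0]])
                     for x in pts])

# Synthetic sample: y = x^T Q x + r^T x + s with known parameters.
rng = np.random.default_rng(0)
pts = rng.normal(size=(40, 2))
Q = np.array([[1.0, 0.3],
              [0.3, 2.0]])
r, s = np.array([0.5, -1.0]), 0.25
y = np.einsum("ni,ij,nj->n", pts, Q, pts) + pts @ r + s

X = design_matrix(pts)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta stacks [vech(Q); r; s] = [q11, q21, q22, r1, r2, s].
```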