
Imagine you are given a set of data points $\{x_i, y_i\}$, supplemented by a list of known first derivatives $\{y'_i\}$.

How would you construct an interpolating function $y(x)$ (satisfying $y(x_i) = y_i$ and $y'(x_i) = y'_i$) such that the derivative calculated from it has the smallest error?

The function is expected to be well behaved, consisting of piecewise power functions, smoothly linked.

What would you do if you knew second derivatives as well and wanted to work with them instead of the first ones?

  • I have found a relevant question here: http://math.stackexchange.com/questions/91760/hermite-interpolation-of-ex-strange-behaviour-when-increasing-the-number-of. It follows that Hermite interpolation is actually not oscillatory if you use the tables $\{x_i, y_i, y'_i, y''_i\}$ with sufficient numerical accuracy, so the question is solved (see the sketch below). Thanks to everybody for the useful comments. (2012-02-28)
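For readers who want to reproduce this, SciPy's `BPoly.from_derivatives` builds a piecewise polynomial from exactly such a table of values and derivatives. A minimal sketch, assuming NumPy/SciPy; the test function $e^x$ is my choice, not part of the thread:

    import numpy as np
    from scipy.interpolate import BPoly

    # Nodes and, at each node, the list [y_i, y'_i, y''_i].
    # Test data from f(x) = exp(x), all of whose derivatives equal exp(x).
    x = np.array([0.0, 1.0, 2.0])
    yi = [[v, v, v] for v in np.exp(x)]

    # Piecewise-quintic interpolant matching value, slope and curvature
    # at every node, hence C^2 across the joins -- no oscillation here.
    p = BPoly.from_derivatives(x, yi)
    print(p(0.5), p.derivative()(0.5))  # both close to exp(0.5) = 1.6487...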

2 Answers


To settle this question: as already mentioned in the comments, one of the simplest approaches to this problem is (piecewise) Hermite interpolation, of which splines are a special case, since they carry extra restrictions that guarantee continuity of their higher derivatives. In particular, there is an easy modification of the usual divided-difference scheme that produces the unique polynomial satisfying the supplied function and derivative values.
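As an illustration, here is a minimal sketch of that modification, assuming NumPy (the function names are mine): every node is listed twice, and the first-order divided difference at each repeated node is replaced by the prescribed derivative.

    import numpy as np

    def hermite_newton(x, y, dy):
        """Newton coefficients of the Hermite interpolant: double each
        node; first differences at repeated nodes are the given y'."""
        x, y, dy = map(np.asarray, (x, y, dy))
        n = len(x)
        z = np.repeat(x, 2)
        q = np.zeros((2 * n, 2 * n))
        q[0::2, 0] = y
        q[1::2, 0] = y
        q[1::2, 1] = dy                       # f[x_i, x_i] = y'_i
        q[2::2, 1] = np.diff(y) / np.diff(x)  # ordinary divided differences
        for j in range(2, 2 * n):
            for i in range(j, 2 * n):
                q[i, j] = (q[i, j - 1] - q[i - 1, j - 1]) / (z[i] - z[i - j])
        return z, np.diag(q)                  # nodes and f[z_0, ..., z_k]

    def eval_newton(z, c, t):
        """Evaluate the Newton form at t by Horner's scheme."""
        p = c[-1]
        for k in range(len(c) - 2, -1, -1):
            p = p * (t - z[k]) + c[k]
        return p

    # quick check: the cubic Hermite interpolant of sin on [0, 1]
    z, c = hermite_newton([0.0, 1.0], np.sin([0.0, 1.0]), np.cos([0.0, 1.0]))
    print(eval_newton(z, c, 0.5), np.sin(0.5))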

On the other hand, (polynomial) Hermite interpolation is not the be-all and end-all of interpolating given function values and derivatives. For instance, a rational Hermite interpolant (a rational function that satisfies a given set of function and derivative values) may be more appropriate when the function being approximated is expected to have poles or similar behavior; and if the function whose values are given is expected to be periodic, then one certainly needs to look into trigonometric Hermite interpolation. The computations for these variants are slightly more intricate than in the polynomial case, but if they give better results ("better" being up to your criteria, of course), the additional bother is certainly justified.

  • Thank you, sir, especially for the good references to the theory! (2012-03-19)

This answer is a pitch for 'weak solution' techniques, especially intended for the case where your sampled data might be noisy (as in the real world). You could choose just about any basis $\{\phi_k(x) \mid k = 0, \dots, N\}$ (one of my favorites is B-splines). The great thing about such a basis is that you know the derivatives of all the basis functions in advance.

In this approach you represent $y(x) \approx f(x;a) = \sum_{k=0}^{N} a_k \phi_k(x)$ and seek the coefficients $a_k$ that are best in the sense of minimizing an energy functional. I usually take a variational approach to this problem; this way we can accommodate any noise that might be in the data $y_i$. In particular, we know exactly how well we are approximating the data by setting up a cost functional such as:

$ J(a) = \Bigl\|y-\sum_{k=0}^{N} a_k \phi_k(x) \Bigr\|_2^2 + \beta_1\biggl\|\frac{dy}{dx}-\sum_{k=0}^{N} a_k \frac{d\phi_k}{dx} \biggr\|_2^2 + \cdots $

The dots are there because you can keep adding a penalty for each kind of information you have. At this point you enforce the necessary condition for an extremum: namely, for each $k$, $\frac{\partial J(a)}{\partial a_k} = 0$.

Let's call the matrix of sampled basis functions $[\phi_0(x_i), \phi_1(x_i), \phi_2(x_i), \cdots] = \Phi$ and similarly $\bigl[\frac{d\phi_0}{dx}(x_i), \frac{d\phi_1}{dx}(x_i), \frac{d\phi_2}{dx}(x_i), \cdots\bigr] = d\Phi$; the necessary condition then yields the least-squares-style equation:

$(\Phi^*\Phi+\beta_1d\Phi^*d\Phi)a = \Phi^*y + \beta_1d\Phi^*\frac{dy}{dx}$ where $\Phi^*$ and $d\Phi^*$ are the adjoints and $y$ and $dy/dx$ are the sampled data.
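To make this concrete, here is a minimal sketch of assembling and solving that equation, assuming NumPy and a monomial basis $\phi_k(x) = x^k$, so that $d\Phi$ has a closed form (the basis choice and function name are mine, not the answerer's):

    import numpy as np

    def fit_values_and_slopes(x, y, dy, deg, beta1):
        """Solve (Phi^T Phi + beta1 dPhi^T dPhi) a = Phi^T y + beta1 dPhi^T dy
        for the monomial basis phi_k(x) = x**k, k = 0..deg."""
        k = np.arange(deg + 1)
        Phi = x[:, None] ** k                            # Phi[i, k] = x_i^k
        dPhi = np.zeros_like(Phi)
        dPhi[:, 1:] = k[1:] * x[:, None] ** (k[1:] - 1)  # d/dx of x^k
        A = Phi.T @ Phi + beta1 * (dPhi.T @ dPhi)        # normal-equation matrix
        b = Phi.T @ y + beta1 * (dPhi.T @ dy)
        return np.linalg.solve(A, b)                     # coefficients a_k

    # quick check: noisy samples of exp(x) and its derivative
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 30)
    y = np.exp(x) + 0.01 * rng.standard_normal(x.size)
    dy = np.exp(x) + 0.01 * rng.standard_normal(x.size)
    a = fit_values_and_slopes(x, y, dy, deg=5, beta1=1.0)
    print(np.abs((x[:, None] ** np.arange(6)) @ a - np.exp(x)).max())

For a B-spline basis the same normal equations apply; only the construction of $\Phi$ and $d\Phi$ changes.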

There is a lot of literature on how to choose the regularization parameter $\beta_1$, as well as on other kinds of penalties (Tikhonov, $L_1$, TV) that you might add (as Bayesian priors).

  • Big thank you! Very nice of you to share these references! (2014-05-16)