5

I am trying to understand Hermite Interpolation. Here is my pedagogical example.

I want to approximate $f(x)=e^x$ on the domain $[-1,1]$ using Hermite interpolation.

I choose the zeros of the Chebyshev polynomial $T_{n+1}(x)$ as the interpolation points $x_0,x_1,\dots,x_n$, where $$ x_k = \cos\left(\left(k+\frac{1}{2}\right)\frac{\pi }{n+1}\right), \qquad k=0,\dots,n. $$

At each point $x_i$ I construct the list $f(x_i),f'(x_i),\dots,f^{(m)}(x_i)$ for $i=0,\dots,n$. That is, I want to use not only the values of $f$ at the points $x_i$ but also its first $m$ derivatives at these points, giving $m+1$ conditions at each of the $n+1$ nodes. The error in the approximation by the Hermite interpolation polynomial $P_n(x)$ is given by

$$ E_{n}(x)=f(x)-P_{n}(x)=\frac{f^{\left((n+1)(m+1)\right)}(\xi_{x})}{\left((n+1)(m+1)\right)!}\prod_{i=0}^{n}(x-x_{i})^{m+1}, $$

where $\xi_x \in [-1,1]$. For $f(x)=e^x$ every derivative is again $e^x$, so this is easily bounded:

$$ \left|E_{n}(x)\right|\leq\frac{e}{\left((n+1)(m+1)\right)!}\prod_{i=0}^{n}\left|x-x_{i}\right|^{m+1}. $$
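
Since the $x_i$ are the zeros of $T_{n+1}$, the node polynomial has a closed form, which makes the product easy to bound on $[-1,1]$ (using $|T_{n+1}(x)|\le 1$ there):

$$ \prod_{i=0}^{n}(x-x_{i})=\frac{T_{n+1}(x)}{2^{n}} \quad\Longrightarrow\quad \prod_{i=0}^{n}\left|x-x_{i}\right|^{m+1}\leq\frac{1}{2^{\,n(m+1)}} \quad\text{on }[-1,1]. $$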

Of course, if $m=0$ then the Hermite interpolating polynomial is just the Lagrange interpolating polynomial. The problem occurs when I increase $m$, the number of derivatives included in the interpolation procedure at each point $x_i$.

For example, suppose $n=15$ and $m=10$. According to the bound, the error at $x=1$ should then be less than $10^{-35}$ (or something ridiculously small). However, the resulting Hermite polynomial oscillates wildly and its magnitude becomes enormous on the interval $[-1,1]$. This reminds me very much of Runge's phenomenon, but that contradicts the error bound for the Hermite interpolating polynomial $P_n(x)$, and I am using the Chebyshev zeros as the interpolation points. I observed the same behaviour when trying to approximate $\tan(x)$ on a closed sub-interval of $(-\pi/2,\pi/2)$, and the inverse error function, using Hermite interpolation.
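
As a sanity check (this snippet is just my own quick evaluation in exact arithmetic, separate from the interpolation code below; the variable names are arbitrary), the right-hand side of the bound at $x=1$ can be computed directly:

n = 15; m = 10;
nodes = Table[Cos[(k + 1/2) \[Pi]/(n + 1)], {k, 0, n}];  (* exact Chebyshev zeros *)
errorBound = E/Factorial[(n + 1) (m + 1)]*Product[Abs[1 - nodes[[i + 1]]]^(m + 1), {i, 0, n}];
N[errorBound, 10]  (* astronomically small, far below machine epsilon *)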

Has anyone else observed this? Is this expected behaviour? Or have I overlooked something? As it stands it would "seem" that Hermite interpolation is not very useful, but I would have expected it to be better than Lagrange interpolation. Have I missed the point? Can anyone shed some light, please?

Incidentally, I build the Hermite interpolating polynomials using Mathematica's built-in function InterpolatingPolynomial[] as follows.

n = 15; m = 10;
x[k_] := N[Cos[(k + 1/2) \[Pi]/(n + 1)]]  (* Chebyshev zeros, at machine precision *)
interpolationPoints = 
  Table[Join[{{N[x[j]]}}, Table[D[Exp[y], {y, k}] /. y -> x[j], {k, 0, m}]], {j, 0, n}];
  (* each entry has the form {{x_j}, f(x_j), f'(x_j), ..., f^(m)(x_j)} *)
hermiteP[x_] = InterpolatingPolynomial[interpolationPoints, x];
Plot[{Exp[x], hermiteP[x]}, {x, -1, 1}]
Plot[Exp[x] - hermiteP[x], {x, -1, 1}]
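
To put a number on "oscillates wildly", here is a rough diagnostic of the machine-precision interpolant built above (my own ad hoc check; the sampling grid is an arbitrary choice):

maxErr = Max[Table[Abs[Exp[t] - hermiteP[t]], {t, -1., 1., 0.01}]]  (* enormous at machine precision, nowhere near the theoretical bound *)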

If I reduce the number of derivatives $m$, the result gets better. But I thought the result should get better as I increase $m$, not the other way around.

  • 0
    Are you aware that all derivatives of $\mathrm e^x$ are $\mathrm e^x$? (Your Mathematica code seems to be calculating them as derivatives.)2011-12-15
  • 0
    It might help if you add an image of the plot you get.2011-12-15
  • 0
    For the tangent and the inverse error function, it makes sense: no polynomial has ever had a vertical or horizontal asymptote. It stands to reason that polynomials will necessarily be a poor approximation to these functions.2011-12-15
  • 0
    Hi joriki, yes of course I'm aware :) the code just lets me easily sub in other functions.2011-12-15
  • 1
    You say "of course" -- you might think otherwise when you've seen more questions on this site ;-)2011-12-15
  • 0
    I have found the cause: it seems that Hermite interpolation, like other interpolation methods, is most likely unstable, in the sense that if $\left(\tilde{x}_{i},\tilde{f}(\tilde{x}_{i})\right)$ are the perturbed values of $(x_i,f(x_i))$, say due to round-off error or measurement, then the resulting interpolating polynomial will deviate wildly from the true Hermite interpolating polynomial, especially when the Lebesgue constant is large. To remedy this, increase the working precision in the second line of the code: x[k_] := N[..., 20]2011-12-15
  • 1
    [This book](http://books.google.de/books?id=zXnSxY9G2JgC&lpg=PA171&ots=pWlEBUFUSR&dq=hermite%20interpolation%20oscillations&pg=PA171#v=onepage&q=hermite%20interpolation%20oscillations&f=false) says "When there are a large number of nodes, the Hermite polynomials also exhibit oscillation weaknesses." The differences $x-x_i$ can be up to $2$ in your case, but that should be more than compensated with the small ones from nearby points, so it does seem the error bound should be small as you say. I suspect it might just be a rounding problem? Ah, I see you beat me to it :-)2011-12-15
  • 0
    @J. M. Note I said a closed sub-interval of $(-\pi/2,\pi/2)$, on which $\tan(x)$ is obviously continuously differentiable. And yes, I agree that no polynomial has these kinds of asymptotes, but I wonder why you said this? As their Taylor series suggest, these functions can be approximated arbitrarily well by polynomials.2011-12-15
  • 0
    @joriki, thanks for your input ... I don't know what people who don't know how to differentiate $e^x$ are doing here, lol ... Not sure what I should do with this question now. Is it worth keeping, or should I vote to close? Maybe you could provide a formal answer I could accept.2011-12-15
  • 0
    I think it's worth keeping; others might come across this problem. You can write an answer yourself, if you like, and accept it.2011-12-15

1 Answer

4

I have found the cause: it seems that Hermite interpolation, like other interpolation methods, is most likely unstable, in the sense that if $\left(\tilde{x}_{i},\tilde{f}(\tilde{x}_{i})\right)$ are the perturbed values of $(x_i,f(x_i))$, say due to round-off error or measurement, then the resulting interpolating polynomial will deviate wildly from the true Hermite interpolating polynomial (which of course is unique), especially when the Lebesgue constant is large. To remedy this, increase the working precision in the second line of the code above:

x[k_] := N[Cos[(k + 1/2) \[Pi]/(n + 1)], 20]  (* compute the nodes to 20 significant digits instead of machine precision *)
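
For completeness, here is a sketch of re-running the construction with the higher-precision nodes and sampling the residual (the sampling grid and the variable name t are arbitrary choices on my part):

n = 15; m = 10;
interpolationPoints =
  Table[Join[{{x[j]}}, Table[D[Exp[y], {y, k}] /. y -> x[j], {k, 0, m}]], {j, 0, n}];
Clear[hermiteP];
hermiteP[t_] = InterpolatingPolynomial[interpolationPoints, t];
Max[Table[Abs[Exp[t] - hermiteP[t]], {t, -1, 1, 1/100}]]  (* should now be small, per the remedy above *)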