I'm working on a homework problem where I have to derive an estimate of the interpolation error and compare it with the actual error. That part is fine and I'm done with it. But while working on this in Matlab I started to wonder about something I can't figure out (yet).
If we interpolate the polynomial $f(x)=c_3x^3+c_2x^2+c_1x+c_0$ for $x\in[a,b]$ with, for example, a Newton polynomial of degree two, $p_2(x)=b_2x^2+b_1x+b_0$ (using the nodes $x_0,x_1,x_2$ in the interpolation), we can estimate the interpolation error by
$$|f(x)-p_2(x)|=\left|\frac{f^{(3)}(\xi)}{3!}\prod_{i=0}^2(x-x_i)\right|.$$
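For my concrete $f$ the third derivative is constant, $f^{(3)}(x)=6c_3$ for every $x$, so the estimate does not actually depend on the unknown $\xi$ and reduces to
$$\left|\frac{f^{(3)}(\xi)}{3!}\prod_{i=0}^{2}(x-x_i)\right| = |c_3(x-x_0)(x-x_1)(x-x_2)|.$$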
I've implemented the Newton interpolation polynomial in Matlab, together with this error estimate. Now, when I plot the error estimate it agrees surprisingly well with the actual error $|f(x)-p_2(x)|$, so well that I assume the remaining deviation is just numerical precision.
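For reference, here is a stripped-down sketch of the comparison I'm doing (not my actual script; I use `polyfit` here instead of my Newton implementation, and the coefficients and nodes are just example values):

```matlab
% Example cubic f(x) = c3*x^3 + c2*x^2 + c1*x + c0 (coefficients picked arbitrarily)
c  = [2 -1 3 0.5];              % [c3 c2 c1 c0] in Matlab's polyval ordering
a  = 0;  b = 2;
xn = [a (a+b)/2 b];             % interpolation nodes x0, x1, x2

% Degree-two interpolant through the three nodes. polyfit through exactly
% three points gives the same polynomial as the Newton form, only written
% in the monomial basis.
p = polyfit(xn, polyval(c, xn), 2);

% Actual error and error estimate on a fine grid. For this cubic,
% f'''(x) = 6*c3 everywhere, so f'''(xi)/3! = c3 = c(1).
x    = linspace(a, b, 500);
eact = abs(polyval(c, x) - polyval(p, x));
eest = abs(c(1) * (x - xn(1)).*(x - xn(2)).*(x - xn(3)));

plot(x, eact, 'b-', x, eest, 'r--')
legend('actual error |f - p_2|', 'error estimate')
```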
Will it always be like this, that the two errors are equal, when you interpolate a polynomial of degree $n$ with an interpolating polynomial of degree $n-1$?
For my problem, the actual error is of the form $e(x) = c_3x^3+(c_2-b_2)x^2+(c_1-b_1)x+(c_0-b_0)$, whereas the error estimate is of the form $\hat{e}(x)=c_3(x-x_0)(x-x_1)(x-x_2)$. Is it just "luck" that the coefficients of the actual error turn out to be equal to the coefficients of the estimated error? Is this something that can be proved?
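A quick numerical check of this observation, continuing with the variables from the snippet above (so again just example values), is to expand $c_3(x-x_0)(x-x_1)(x-x_2)$ with `conv` and compare coefficient-wise with $f-p_2$:

```matlab
% Expand c3*(x - x0)*(x - x1)*(x - x2) into monomial coefficients
w = conv(conv([1 -xn(1)], [1 -xn(2)]), [1 -xn(3)]);

disp(c(1)*w)     % coefficients of the estimated error
disp(c - [0 p])  % coefficients of the actual error f - p2
```

In my runs these two rows agree to rounding error, which is exactly the coincidence I'm asking about.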