I want to interpolate a function of $d$ variables on a Cartesian grid using multivariate interpolation, and characterize the interpolation error in terms of bounds on the partial derivatives of the original function.
I am looking for a theorem of the following kind: given a function $f(x,y)$ on the unit square and its bilinear interpolant
$$g(x,y) = (1-x)(1-y)\,f(0,0) + (1-x)\,y\,f(0,1) + x\,(1-y)\,f(1,0) + x\,y\,f(1,1),$$
the error $|f(x,y)-g(x,y)|$ is bounded above by some explicit quantity.
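To make this concrete, here is a small numerical sketch. The test function, the sampling grid, and the candidate bound of the form $\tfrac{1}{8}\bigl(\max|f_{xx}| + \max|f_{yy}|\bigr)$ are my own choices for illustration; the bound is only the *kind* of estimate I am after, not a result I can cite.

```python
import numpy as np

# Build the bilinear interpolant g from the four corner values of f on the
# unit square and compare max|f - g| with a candidate bound of the form
# (1/8) * (max|f_xx| + max|f_yy|).

def f(x, y):
    return np.sin(x + 2.0 * y)  # smooth test function (arbitrary choice)

def g(x, y):
    # bilinear interpolant from the corner values, exactly as written above
    return ((1 - x) * (1 - y) * f(0.0, 0.0)
            + (1 - x) * y * f(0.0, 1.0)
            + x * (1 - y) * f(1.0, 0.0)
            + x * y * f(1.0, 1.0))

xs = np.linspace(0.0, 1.0, 201)
X, Y = np.meshgrid(xs, xs)

err = np.max(np.abs(f(X, Y) - g(X, Y)))

# second partials of the chosen f: f_xx = -sin(x+2y), f_yy = -4 sin(x+2y)
fxx_max = np.max(np.abs(-np.sin(X + 2.0 * Y)))
fyy_max = np.max(np.abs(-4.0 * np.sin(X + 2.0 * Y)))
bound = (fxx_max + fyy_max) / 8.0

print(f"max |f - g| = {err:.4f}, candidate bound = {bound:.4f}")
```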
I know there are some results on generalizations of divided differences to higher dimensions, but I need an error bound proportional to some power of the grid size.
I have managed to derive and prove error formulas for the bilinear and trilinear cases (dimensions 2 and 3), but I need the same theory for the general multivariate case.
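For concreteness, the shape of statement I am hoping for (this is only my guess at the general form, extrapolated from the low-dimensional cases) is
$$\|f - g\|_{\infty} \;\le\; C \sum_{i=1}^{d} h_i^{2}\,\left\|\frac{\partial^{2} f}{\partial x_i^{2}}\right\|_{\infty},$$
where $g$ is the multilinear (tensor-product) interpolant on a grid cell with side lengths $h_1,\dots,h_d$ and $C$ is an absolute constant (the classical one-dimensional linear-interpolation bound has $C = 1/8$).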
Could anyone point me to a reference containing such theorems, or state the general form of this theorem?
Thanks!