I have a real, smooth function of many variables (10 or more), for which I have the exact Jacobian and Hessian. It turns out that unless the norm of the increment (the step in the argument) is very small, the linear approximation (using just the Jacobian) is usually more accurate than the quadratic one (using the Hessian as well).
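For concreteness, here is a minimal sketch of the kind of comparison I am making. The function $f(x) = e^{a \cdot x}$ is just a hypothetical stand-in (it has simple exact derivatives); it is not my real function. With this stand-in the quadratic model is more accurate at every step size, as the books predict, whereas with my actual function it is not:

```python
import numpy as np

# Hypothetical stand-in: f(x) = exp(a.x), chosen only because its exact
# gradient and Hessian are one-liners. Not my actual function.
rng = np.random.default_rng(0)
n = 10
a = rng.standard_normal(n) / np.sqrt(n)

def f(x):
    return np.exp(a @ x)

def grad(x):
    return a * f(x)                  # exact gradient

def hess(x):
    return np.outer(a, a) * f(x)     # exact Hessian

x0 = rng.standard_normal(n)
d = rng.standard_normal(n)
d /= np.linalg.norm(d)               # unit step direction

for h in [1.0, 0.3, 0.1, 0.03, 0.01]:
    p = h * d
    exact = f(x0 + p)
    lin = f(x0) + grad(x0) @ p                   # first-order model
    quad = lin + 0.5 * p @ hess(x0) @ p          # second-order model
    print(f"h={h:5.2f}  |err_lin|={abs(exact - lin):.2e}"
          f"  |err_quad|={abs(exact - quad):.2e}")
```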
My guess is that the radius of convergence of the Taylor series is very small (although in several variables it is a domain of convergence rather than a radius).
I'm using this for trust-region optimization. The books, however, don't mention this possible issue; then again, one rarely has the exact Hessian, whereas I do but cannot benefit from it. Could anybody provide some thoughts or links? In particular, I'd like to estimate a good error bound for the quadratic approximation.
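For reference, the standard Taylor-remainder bound I am aware of says that for a scalar $f$ whose Hessian is $L$-Lipschitz near $x$,

$$
\left| f(x+p) - \Big( f(x) + \nabla f(x)^{\top} p + \tfrac{1}{2}\, p^{\top} \nabla^2 f(x)\, p \Big) \right| \;\le\; \frac{L}{6}\, \|p\|^3 ,
$$

while the linear model's error is only $O(\|p\|^2)$, so for small enough $\|p\|$ the quadratic model should win. What I'd like is a practical way to estimate $L$ (or a sharper, computable bound) for my function, to see at what step size the crossover should occur.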
Thank you very much in advance for your time.