Suppose $f:\mathbb{R}\rightarrow\mathbb{R}$ is a $C^{2}$ function. Is it true that for every $x,h\in\mathbb{R}$ there exists some $\xi$ between $x$ and $x+h$ such that $$f(x+h)=f(x)+f'(x)h+\frac{1}{2}f''(\xi)h^{2}?$$ I'm studying optimization and the author uses this fact in $\mathbb{R}^{n}$, so I think that if I can prove it in $\mathbb{R}$, the adaptation should be easy. Is that true? Thanks
Mean Value Theorem for the Second Derivative
1
analysis
-
This looks like Taylor's Theorem to me. See http://en.wikipedia.org/wiki/Taylor%27s_Theorem – 2012-10-23
-
But in Taylor's theorem we have the remainder. What happened to the remainder in this case? – 2012-10-23
-
@Tomás This is Taylor's theorem, indeed. The "remainder" is the term $$\frac{1}{2}f''(\xi)h^2$$ Note that we are actually "expanding" around $x$, that is, $$f(h+x)=f(x)+f'(x)h+R_{x,2}(h)$$ – 2012-10-23
-
@PeterTamaroff, thank you. – 2012-10-23
-
@Tomás Taylor's _theorem_ is distinct from Taylor series. The term $\frac{1}{2}f''(\theta)h^2$ encapsulates the error term in the Taylor polynomial. – 2012-10-23
-
I had never paid enough attention to this remainder. Now I understand. Thanks, all – 2012-10-23
1 Answer
2
This is just the Lagrange form of the remainder for Taylor's Theorem.
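For reference, a minimal statement of the result being invoked (written out for the first-order expansion, under the question's $C^{2}$ hypothesis): for every $x,h\in\mathbb{R}$ with $h\neq 0$ there exists $\xi$ strictly between $x$ and $x+h$ such that $$f(x+h)=f(x)+f'(x)h+\frac{1}{2}f''(\xi)h^{2}.$$ One way to prove it directly, without citing the general theorem (a standard Rolle's-theorem argument, not necessarily the route the book takes), is to set $$g(t)=f(x+t)-f(x)-f'(x)t-\frac{t^{2}}{h^{2}}\Big(f(x+h)-f(x)-f'(x)h\Big),$$ observe that $g(0)=g(h)=0$ and $g'(0)=0$, and apply Rolle's theorem twice on the interval between $0$ and $h$ to obtain a point $t_{0}$ with $g''(t_{0})=0$; then $\xi=x+t_{0}$ gives the identity above.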