
Suppose $f:\mathbb{R}\rightarrow\mathbb{R}$ is a $C^{2}$ function. Is it true that for every $x$ and every $h>0$, there exists $\theta\in(x,x+h)$ such that $$f(x+h)-f(x)=f'(x)h+\frac{1}{2}f''(\theta)h^2\,?$$

I'm studying optimization, and the author uses this fact in $\mathbb{R}^{n}$. I think that if I can prove it in $\mathbb{R}$, the adaptation will be easy, but is the statement true?

Thanks

  • This looks like Taylor's Theorem to me. See http://en.wikipedia.org/wiki/Taylor%27s_Theorem (2012-10-23)
  • But in Taylor's theorem we have a remainder. What happened to the remainder in this case? (2012-10-23)
  • @Tomás This is Taylor's theorem, indeed. The "remainder" is the term $$\frac 1 2 f''(\xi)h^2$$ Note that this is actually "expanding" around $x$, that is $$f(h+x)=f(x)+f'(x)h+R_{x,2}(h)$$ (2012-10-23)
  • @PeterTamaroff, thank you. (2012-10-23)
  • @Tomás Taylor's _theorem_ is distinct from the Taylor series. The term $\frac{1}{2}f''(\theta)h^2$ encapsulates the error term of the Taylor polynomial. (2012-10-23)
  • I had never paid enough attention to this remainder. Now I understand. Thanks, all. (2012-10-23)

1 Answer

This is just the Lagrange form of the remainder for Taylor's Theorem.
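As a sanity check (my own illustrative example, not from the question), one can verify the statement numerically. Taking $f(x)=e^x$, for which $f''=f$, we can solve $f''(\theta)=\frac{2}{h^2}\bigl(f(x+h)-f(x)-f'(x)h\bigr)$ for $\theta$ explicitly and confirm that it lands strictly inside $(x,x+h)$:

```python
import math

# Numerical check of Taylor's theorem with the Lagrange remainder,
# using f(x) = e^x, so that f = f' = f''.
f = math.exp

x, h = 0.0, 0.5

# Remainder left over after the first-order term (here f'(x) = f(x)).
remainder = f(x + h) - f(x) - f(x) * h

# Solve (1/2) * f''(theta) * h^2 = remainder for theta:
# e^theta = 2 * remainder / h^2, so theta = log(...).
theta = math.log(2.0 * remainder / h**2)

print(x < theta < x + h)  # theta lies strictly in (x, x + h)
```

By construction $\frac12 f''(\theta)h^2$ reproduces the remainder exactly (up to floating-point roundoff), and the printed check confirms $\theta\in(x,x+h)$, as the theorem predicts.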