
Let $f \in C^2([a,b])$. How can I show, using Taylor expansion (and the Lagrange form of the remainder), that there exists a $C>0$ such that for all $x,y \in [a,b]$ we have $f'(x) - f'(y) \leq C(x-y)$?

  • You need some absolute values in that. (2017-02-07)
  • You may add as many absolute values as you need to make this question as sensible as possible. (2017-02-07)
  • Using Taylor expansion requires $f \in C^\infty$. (2017-02-07)
  • No worries, let $f$ be $C^\infty$. (2017-02-07)
  • Let $C = \sup_{x \in [a,b]} |f''(x)|$ and upper bound the Taylor remainder, assuming that $x$ … (2017-02-07) (a sketch along these lines is given below)
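
For reference, here is a rough sketch of the argument the last comment points at; this is only an outline, and the intermediate point $\xi$ below is introduced here (it comes from the Lagrange form of the remainder, not from the original question). Since $f \in C^2([a,b])$, $f'$ is differentiable, so the zeroth-order Taylor expansion of $f'$ about $y$ with Lagrange remainder (equivalently, the mean value theorem applied to $f'$) gives, for some $\xi$ between $x$ and $y$,

$$f'(x) = f'(y) + f''(\xi)\,(x - y).$$

Setting $C = \sup_{t \in [a,b]} |f''(t)|$, which is finite because $f''$ is continuous on the compact interval $[a,b]$, and taking absolute values as the first comment suggests,

$$|f'(x) - f'(y)| = |f''(\xi)|\,|x - y| \le C\,|x - y| \qquad \text{for all } x, y \in [a,b].$$

If this supremum happens to be $0$, then $f'$ is constant and any $C > 0$ works, so a strictly positive constant can always be chosen.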

0 Answers