Let $f \in C^2([a,b])$. How can I show that there must exist a $C>0$ such that $f'(x) - f'(y) \leq C(x-y)$ for all $x,y \in [a,b]$, using a Taylor expansion (and the Lagrange remainder formula)?
Taylor: $f'(x) - f'(y) \leq C (x-y)$
real-analysis
functions
taylor-expansion
-
0You need some absolute values in that. – 2017-02-07
-
0You may add as many absolute values as you need to make the question as sensible as possible. – 2017-02-07
-
0Using a Taylor expansion needs $f\in C^\infty$. – 2017-02-07
-
0No worries, let $f$ be $C^\infty$. – 2017-02-07
-
0Let $C= \sup_{x \in [a,b]} |f''(x)|$ and upper-bound the Taylor remainder, assuming WLOG that $x > y$. – 2017-02-07
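
Expanding the last comment's hint into a full argument, here is a minimal LaTeX sketch (the lemma statement, the document scaffolding, and the "$1+$" in the constant, which only guarantees strict positivity, are my own additions):

```latex
\documentclass{article}
\usepackage{amsmath, amssymb, amsthm}
\newtheorem{lemma}{Lemma}

\begin{document}

\begin{lemma}
Let $f \in C^2([a,b])$. Then there exists $C > 0$ such that
$|f'(x) - f'(y)| \le C\,|x - y|$ for all $x, y \in [a,b]$.
\end{lemma}

\begin{proof}
Since $f''$ is continuous on the compact interval $[a,b]$, the constant
$C := 1 + \sup_{t \in [a,b]} |f''(t)|$ is finite (the $1+$ merely ensures
$C > 0$ even when $f'' \equiv 0$). Fix $x, y \in [a,b]$ with $x \neq y$.
The zeroth-order Taylor expansion of $f'$ about $y$, with Lagrange
remainder, gives a $\xi$ strictly between $x$ and $y$ such that
\[
  f'(x) = f'(y) + f''(\xi)\,(x - y).
\]
Taking absolute values, as the first comment insists,
\[
  |f'(x) - f'(y)| = |f''(\xi)|\,|x - y| \le C\,|x - y|.
\]
The case $x = y$ is trivial, so the bound holds for all $x, y \in [a,b]$.
\end{proof}

\end{document}
```

Note that only $f \in C^2$ is needed here, not $f \in C^\infty$: the expansion is applied to $f'$ to order zero, so the Lagrange remainder only requires $f''$ to exist and be continuous.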