Given $f:(0,\infty) \rightarrow \mathbb{R}$ with $f''(x)>0$ for all $x \in (0,\infty)$. Is it correct to say that $f'(x)$ is a monotonically increasing function? Can I correctly assume that for any $a,b \in (0,\infty)$, $f'(a) > f'(b)$ if $a > b$?
Is $f'(x)$ monotonically increasing if $f''(x) > 0$?
calculus
2 Answers
The mean value theorem says yes. For each $b>a>0$ there exists $c\in(a,b)$ with $f'(b)=f'(a)+f''(c)(b-a)$; since $f''(c)>0$ and $b-a>0$, we get $f'(b)>f'(a)$, which gives the required statement.
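For concreteness, here is an illustrative example (my own, not from the answer above): take $f(x)=x\ln x$ on $(0,\infty)$. Then
$$f'(x)=\ln x+1,\qquad f''(x)=\frac{1}{x}>0,$$
and $f'$ is indeed strictly increasing, so $f'(b)>f'(a)$ whenever $b>a>0$.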
thanks for the explanation. – 2011-01-05
Yes. Set $g=f'$. Then $g'=f''>0$, so $g=f'$ is strictly increasing (a function with positive derivative on an interval is strictly increasing). By the way, $f$ is convex.
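For reference, a standard fact that makes the parenthetical precise (not part of the original answer): $f''>0$ on $(0,\infty)$ gives strict convexity, i.e.
$$f\bigl(t a+(1-t)b\bigr)<t\,f(a)+(1-t)\,f(b)\qquad\text{for all }a\neq b\text{ in }(0,\infty),\ t\in(0,1).$$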