If $f$ is continuous on $[a,b]$ and $f' > 0$ on $(a,b)$, then $f$ is monotonically increasing on $[a,b]$; the proof follows from the mean value theorem. Now suppose we have $f \in C[a,b]$ and the right derivative $f'(x+) > 0$ for all $x \in (a,b)$. Does it follow that $f$ is monotonically increasing on $[a,b]$?
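For reference, the classical argument alluded to above can be written out in one step (a sketch, using the standard mean value theorem):

```latex
% For any a <= x < y <= b, the mean value theorem gives some xi in (x, y) with
\[
  f(y) - f(x) = f'(\xi)\,(y - x), \qquad \xi \in (x, y) \subseteq (a, b).
\]
% Since f'(\xi) > 0 and y - x > 0, the right-hand side is positive,
% so f(x) < f(y); as x < y were arbitrary, f is monotonically increasing.
```

The question is precisely whether this conclusion survives when only the one-sided derivative $f'(x+)$ is assumed to exist and be positive.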
monotonicity of a function which is right differentiable
calculus
real-analysis
1 Answer
Suppose, to get a contradiction, that $f$ is not monotonically increasing: then there are points $u < v$ with $f(u) > f(v)$, and by continuity we may take $a < u < v \le b$. Let $C=\{x\in[u,v]:f(x)=f(u)\}$; $f$ is continuous, so $C$ is a closed subset of $[u,v]$ and therefore has a maximum element. Without loss of generality we may assume that this maximum element is $u$ itself. The intermediate value theorem then ensures that $f(x) < f(u)$ for every $x \in (u,v]$: if we had $f(x) > f(u)$ for some $x \in (u,v)$, then since $f(v) < f(u)$ there would be a point of $C$ in $(x,v)$, contradicting the maximality of $u$. But then $\dfrac{f(x)-f(u)}{x-u} < 0$ for every $x \in (u,v]$, so $f'(u+) \le 0$, contradicting the hypothesis that $f'(u+) > 0$. Hence $f$ is monotonically increasing on $(a,b)$, and by continuity on all of $[a,b]$.
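A concrete illustration (my own, not from the answer above): the function $f(x) = 2x + |x - 0.5|$ on $[0,1]$ is continuous, fails to be differentiable at $x = 0.5$, yet has $f'(x+) \ge 1 > 0$ everywhere, and numerically it is indeed monotonically increasing:

```python
def f(x):
    """Continuous on [0, 1], kinked at x = 0.5, with f'(x+) >= 1 everywhere."""
    return 2 * x + abs(x - 0.5)

def right_quotient(g, x, h=1e-7):
    """One-sided (right) difference quotient approximating g'(x+)."""
    return (g(x + h) - g(x)) / h

xs = [i / 1000 for i in range(1000)]

# The right difference quotients stay bounded away from 0 on the grid,
# even at the non-differentiable point x = 0.5 (where f'(0.5+) = 3) ...
assert all(right_quotient(f, x) > 0.5 for x in xs)

# ... and f is strictly increasing on the grid, consistent with the theorem.
vals = [f(x) for x in xs]
assert all(vals[i] < vals[i + 1] for i in range(len(vals) - 1))
```

This does not prove anything, of course, but it shows the kind of function the weaker hypothesis admits: positivity of the one-sided derivative alone is enough to force monotonicity.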