monotonicity of a function which is right differentiable

2

calculus
real-analysis

If $f$ is continuous on $[a,b]$ and $f'>0$ on $(a,b)$, then $f$ is monotonically increasing on $[a,b]$; the proof follows from the mean value theorem. Now suppose instead that $f\in C[a,b]$ and the right-hand derivative satisfies $f'(x+)>0$ for all $x\in(a,b)$. Does it still follow that $f$ is monotonically increasing on $[a,b]$?
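(For reference, the mean value theorem step in the differentiable case is the standard one: for any $u,v$ with $a\le u<v\le b$ there is $\xi\in(u,v)$ with
$$f(v)-f(u)=f'(\xi)\,(v-u)>0,$$
so $f(v)>f(u)$.)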
1 Answer
1
Suppose that $a\le u<v\le b$ and $f(u)>f(v)$. Let $C=\{x\in[u,v]:f(x)=f(u)\}$; $f$ is continuous, so $C$ is a closed subset of $[u,v]$ and therefore has a maximum element. Without loss of generality we may assume that this maximum element is $u$. (If this maximum element is $a$, continuity lets us first replace $u$ by a nearby point of $(a,v)$ at which $f$ still exceeds $f(v)$, so we may also assume $u\in(a,b)$.) The intermediate value theorem then ensures that $f(x)<f(u)$ for every $x\in(u,v]$: any such $x$ with $f(x)\ge f(u)$ would, together with $f(v)<f(u)$, produce a point of $C$ to the right of $u$. It follows that $f'(u+)\le 0$, contradicting the hypothesis that $f'(x+)>0$ for all $x\in(a,b)$. Hence no such pair $u<v$ exists, and $f$ is monotonically increasing on $[a,b]$.
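Written out, the final inequality is a difference-quotient estimate (in the notation of the answer above): since $f(x)<f(u)$ for every $x\in(u,v]$, each quotient below has a negative numerator and a positive denominator, so
$$f'(u+)=\lim_{x\to u^+}\frac{f(x)-f(u)}{x-u}\le 0.$$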
-
1 @Tigran: I don’t think that you understand the argument. I’m not using the mean value theorem anywhere. Of course $f$ is constant on $C$; so what? $C$ is used merely to find its maximum, $u$, so that I have an interval $(u,v]$ on which $f(x)<f(u)$. – 2011-11-02
-
0 Now I get your argument. Thank you for your nice solution. – 2011-11-02
-
0 @Tigran: You’re very welcome. – 2011-11-02