
If $f$ is continuous on $[a,b]$ and $f'>0$ on $(a,b)$, then $f$ is monotonically increasing on $[a,b]$; the proof follows from the mean value theorem. Now suppose instead that $f\in C[a,b]$ and the right-hand derivative satisfies $f'(x+)>0$ for all $x\in(a,b)$. Does it follow that $f$ is monotonically increasing on $[a,b]$?
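(For reference, the classical argument runs as follows: for any $a\le x<y\le b$, the mean value theorem gives a $c\in(x,y)$ with
$$f(y)-f(x)=f'(c)\,(y-x)>0,$$
so $f(x)<f(y)$.)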

1 Answer


Suppose, to get a contradiction, that there are points $a\le u<v\le b$ with $f(u)>f(v)$. If $u=a$, continuity gives a point of $(a,v)$ at which $f$ still exceeds $f(v)$, so we may assume $u\in(a,b)$. Let $C=\{x\in[u,v]:f(x)=f(u)\}$; $f$ is continuous, so $C$ is a closed subset of $[u,v]$ and therefore has a maximum element, which lies in $[u,v)$ since $f(v)<f(u)$. Without loss of generality we may assume that this maximum element is $u$ itself. The intermediate value theorem then ensures that $f(x)<f(u)$ for all $x\in(u,v]$ (if $f$ exceeded $f(u)$ somewhere in $(u,v]$, the IVT would produce a point of $C$ beyond $u$). But $f'(u+)>0$, so there is a $y\in(u,v)$ such that $f(x)>f(u)$ for every $x\in(u,y)$. This contradiction shows that $f$ must be monotonically increasing.
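To spell out the last step: $f'(u+)>0$ means
$$\lim_{h\to 0^{+}}\frac{f(u+h)-f(u)}{h}>0,$$
so the difference quotient is positive for all sufficiently small $h>0$, i.e. $f(u+h)>f(u)$. Choosing $\delta>0$ small enough that $u+\delta<v$ and setting $y=u+\delta$ gives exactly the $y$ used above, contradicting $f(x)<f(u)$ on $(u,v]$.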
