
I'm working on a problem in baby Rudin. Chapter 5, Exercise 14 reads:

Let $f$ be a differentiable real function defined in $(a,b)$. Prove that $f$ is convex if and only if $f'$ is monotonically increasing.

I am trying to prove that $f'$ is monotonically increasing under the assumption that $f$ is convex. I have written a proof, but a friend and I disagree about its validity. Here is my argument.

First, assume that $f$ is convex. Since $f$ is differentiable and real on $(a,b)$, $f$ is continuous on $(a,b)$. So, for $a<s<u\le v<t<b$, by the mean value theorem there exist points $y_1\in[s,u]$ and $y_2\in[v,t]$ such that $$f'(y_1)=\frac{f(u)-f(s)}{u-s}\quad\text{and}\quad f'(y_2)=\frac{f(t)-f(v)}{t-v}.$$

Then, by Exercise 23 of Chapter 4 (proven previously), $$\frac{f(u)-f(s)}{u-s}\leq\frac{f(t)-f(v)}{t-v},$$ that is, $f'(y_1)\leq f'(y_2)$. Hence, $f'$ is monotonically increasing. $\hspace{3.5in}\square$

My friend claims that this merely proves that for any two intervals $[s,u]$ and $[v,t]$ there are points that satisfy $f'(y_1)\leq f'(y_2)$. He says that what I need to prove is that for any two arbitrary points $y_1\leq y_2$ we have $f'(y_1)\leq f'(y_2)$. (I should mention that he doesn't know how one would do this.)

Is he right? Is my proof insufficient? If so, how can I fix it?

  • +1 for showing us your work and asking a clear question. (2011-11-14)

1 Answer

Your friend is right: the mean value theorem only hands you *some* points $y_1$ and $y_2$ in those intervals, so your argument says nothing about two arbitrarily chosen points. But the same slope inequality lets you fix the proof by taking limits instead.

From the previously solved exercise you have, for $a<s<u\le v<t<b$, $$\frac{f(u)-f(s)}{u-s}\leq\frac{f(t)-f(v)}{t-v}.$$ Since $f$ is differentiable, letting $u\to s^+$ and $v\to t^-$ preserves the inequality, so for arbitrary $s<t$ in $(a,b)$, $$f'(s)=\lim_{u\to s^+}\frac{f(u)-f(s)}{u-s}\leq\lim_{v\to t^-}\frac{f(t)-f(v)}{t-v}=f'(t).$$
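
In more detail, here is a sketch of how that limiting argument can be written out, assuming the slope inequality from Chapter 4, Exercise 23: if $f$ is convex and $a<s<u<t<b$, then $$\frac{f(u)-f(s)}{u-s}\leq\frac{f(t)-f(s)}{t-s}\leq\frac{f(t)-f(u)}{t-u}.$$

Fix $s<t$ in $(a,b)$ and take any $u,v$ with $s<u<v<t$. Applying the exercise to the triples $(s,u,v)$ and $(u,v,t)$ gives $$\frac{f(u)-f(s)}{u-s}\;\leq\;\frac{f(v)-f(u)}{v-u}\;\leq\;\frac{f(t)-f(v)}{t-v}.$$ Since $f$ is differentiable at $s$ and at $t$, the outer quotients converge to $f'(s)$ and $f'(t)$ as $u\to s^+$ and $v\to t^-$, while the middle quotient converges to $\frac{f(t)-f(s)}{t-s}$ because $f$ is continuous. Non-strict inequalities survive these limits, so $$f'(s)\;\leq\;\frac{f(t)-f(s)}{t-s}\;\leq\;f'(t).$$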