
Let $f(x)$ be twice differentiable on $[a, b]$. The Mean Value Theorem tells us that there is a point $c$ with $a \lt c \lt b$ such that $f'(c)=\frac{f(b)-f(a)}{b-a}$.

If $f''(x) \gt 0$ for all $x \in [a, b]$ and $f''(x)$ is strictly increasing, then $c \gt \frac{a+b}{2}$.

I'm having a hard time figuring this one out. Moreover, I can't seem to understand why this is true intuitively.

I tried the following: write the degree-1 Taylor expansion of $f(x)$ around $a$. Then, for $x=b$, we get $f(b)-f(a)=f'(a)(b-a)+\frac{1}{2}f''(\xi)(b-a)^2 \implies f'(c)=f'(a)+\frac{1}{2}f''(\xi)(b-a)$.

Now I still haven't used the fact that $f''(x)$ is positive and strictly increasing on $[a, b]$ (which, by the way, also implies that $f'(x)$ is strictly increasing). I tried playing with the expression above, but I just don't see which direction to go.
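As a numerical sanity check (my own toy example, not part of the problem), I tried $f(x)=e^x$ on $[0,1]$, where $f''(x)=e^x$ is positive and strictly increasing, and the claim does hold there:

```python
import math

# Sanity check with f(x) = e^x on [a, b] = [0, 1]:
# f''(x) = e^x is positive and strictly increasing, so c should exceed (a+b)/2.
a, b = 0.0, 1.0
slope = (math.exp(b) - math.exp(a)) / (b - a)   # (f(b) - f(a)) / (b - a) = e - 1

# MVT point: solve f'(c) = e^c = slope, i.e. c = ln(slope)
c = math.log(slope)

print(f"c = {c:.6f}, midpoint = {(a + b) / 2}")  # c ≈ 0.541325 > 0.5
assert c > (a + b) / 2
```

Here $c = \ln(e-1) \approx 0.5413 \gt 0.5$, so at least the statement checks out on an example.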

Please don't give a full solution. Hints and intuition are appreciated.

3 Answers


Try sketching a graph, since you are looking for intuition only.

A sketch of $f'$ will be easier, as you can then use the Intermediate Value Theorem too.

Say we want to see that the MVT holds for the case of $f(x)$ on $[a,b]$.

We need to prove that the slope $\frac{f(b)-f(a)}{b-a}$ lies inside $[f'(a), f'(b)]$. This is not too difficult to prove.
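One way to spell this out, using only what is given (since $f$ is twice differentiable, $f'$ is continuous): the secant slope is the average value of $f'$,
$$\frac{f(b)-f(a)}{b-a} = \frac{1}{b-a}\int_a^b f'(t)\,dt,$$
and the average of a continuous, strictly increasing function lies strictly between its endpoint values $f'(a)$ and $f'(b)$. The Intermediate Value Theorem applied to $f'$ then produces the point $c$.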

  • Alright, say that we can prove that the slope at point $a$ is $A$, and at point $b$ is $B$. What is the slope of the line from $(a,A)$ to $(b,B)$? Assume the function you have does not cross that line; if it does, just pick the first point where it crosses (since the slope is the same). Now we have a function from $(a,A)$ to $(b',B')$, and we must have that $\frac{B'-A}{b'-a}$ is strictly between $f'(a)$ and $f'(b')$, and we are done.

I prefer words, really. What the mean value theorem tells us is that at some point on the interval $[a,b]$, the function $f(x)$ (continuous and differentiable on this interval) has a gradient equal to the slope of the line connecting $(a, f(a))$ and $(b, f(b))$.

Why should this be? Well, we can handle the trivial case: if $f'(x) = k$ for some constant $k$ and the graph passes through those endpoints, then $k$ can only be that slope.

So let's take your case: $f''(x) > 0$ and strictly increasing, so the rate of change of the gradient is positive and going up. This means the gradient is increasing in increasingly big chunks.

Sketch what that means, then take the line from $a$ to $b$ and find out where it hits your sketched curve. Why can it not appear before $\frac{a+b}{2}$?
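If you want to turn the sketch into an actual picture, here is a quick throwaway script; the choice $f(x)=x^3$ on $[1,2]$ is just mine for illustration (there $f''(x)=6x$ is positive and strictly increasing):

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative example: f(x) = x**3 on [1, 2], where f''(x) = 6x > 0 and increasing.
a, b = 1.0, 2.0
f = lambda x: x**3
fprime = lambda x: 3 * x**2

slope = (f(b) - f(a)) / (b - a)   # secant slope = 7
c = np.sqrt(slope / 3)            # solve f'(c) = 3c^2 = slope

x = np.linspace(a, b, 200)
plt.plot(x, fprime(x), label="f'(x)")
plt.axhline(slope, linestyle="--", label="secant slope")
plt.axvline((a + b) / 2, color="gray", label="midpoint")
plt.axvline(c, color="red", label=f"c ≈ {c:.3f}")
plt.legend()
plt.show()
```

You should see the dashed secant-slope line meet $f'$ to the right of the midpoint.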

To ask a different question: the rate of change of the gradient is strictly increasing. Given, then, that the gradient between $a$ and $b$ grows in bigger and bigger chunks, where do you expect the gradient to change the most? If I said to compare this to something like a probability distribution, would it make sense why it's called the mean value theorem?

Does that help at all?


After changing $f$ by a suitable linear function you may assume $f(a)=f(b)=f'(c)=0$. Expanding $f$ at $c$ by Taylor's theorem you get $f(x)=f(c)+\frac{1}{2}(x-c)^2 f''(\xi)$, where $\xi$ lies between $c$ and $x$. Now put $x:=a$ and $x:=b$ and draw conclusions.
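To spell out just the first step of this hint: putting $x:=a$ and $x:=b$ gives
$$0 = f(c)+\tfrac12(a-c)^2 f''(\xi_1), \qquad 0 = f(c)+\tfrac12(b-c)^2 f''(\xi_2)$$
with $\xi_1$ between $a$ and $c$, and $\xi_2$ between $c$ and $b$. Comparing the two, and using that $f''$ is positive and strictly increasing, leads to the result.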