Let $f(x)$ be twice differentiable on $[a, b]$. The Mean Value Theorem tells us that there is a point $a\lt c\lt b$ such that $f'(c)=\frac{f(b)-f(a)}{b-a}$.
If $f''(x)\gt 0$ for all $x \in [a, b]$ and $f''(x)$ is strictly increasing, then $c\gt \frac{a+b}{2}$.
I'm having a hard time figuring this one out. What's more, I can't seem to understand intuitively why this is true.
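To at least convince myself the claim is plausible, I tried a concrete case of my own choosing (it's not part of the original problem): $f(x)=e^x$ on $[0,1]$, where $f''(x)=e^x$ is indeed positive and strictly increasing. Then

$$f'(c)=e^c=\frac{f(1)-f(0)}{1-0}=e-1 \;\Longrightarrow\; c=\ln(e-1)\approx 0.541>\frac{1}{2}=\frac{a+b}{2},$$

so in this example the MVT point does land to the right of the midpoint, but I don't see how to turn that into a general argument.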
I tried the following: write the degree-1 Taylor expansion of $f(x)$ around $a$ with the Lagrange remainder. Then, for $x=b$, we get

$$f(b)-f(a)=f'(a)(b-a)+\frac{1}{2}f''(\xi)(b-a)^2 \implies f'(c)=f'(a)+\frac{1}{2}f''(\xi)(b-a)$$

for some $\xi\in(a,b)$.
Now I still haven't used the fact that $f''(x)$ is positive and strictly increasing on $[a, b]$ (which, by the way, also implies that $f'(x)$ is strictly increasing). I tried playing with the expression above, but I just don't see which direction to go.
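Just to see some numbers, plugging the same toy example ($f(x)=e^x$ on $[0,1]$, my own choice) into the expression above gives

$$\frac{1}{2}f''(\xi)=e-2 \;\Longrightarrow\; \xi=\ln\bigl(2(e-2)\bigr)\approx 0.362,$$

which is consistent with $f'(c)=1+(e-2)=e-1$, but it doesn't tell me how $c$ compares with the midpoint.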
Please don't give a full solution. Hints and intuition are appreciated.