This is actually a nice exercise. (In fact, if I recall correctly, it was given as a problem on the very first math exam I took in college. Unfortunately all I was able to say was that it was true if $f'$ was assumed to be continuous, for which I received zero credit.)
Let me set it up a little bit and leave the rest to the interested readers: it is easy to reduce the general case to the following special case. Suppose that $f'(a) > 0$ and $f'(b) < 0$; then there exists $c \in (a,b)$ with $f'(c) = 0$.
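(For concreteness, here is the reduction I have in mind, assuming the general statement is that $f'$ attains every value $k$ strictly between $f'(a)$ and $f'(b)$: when $f'(b) < k < f'(a)$, apply the special case to the auxiliary function

$$g(x) = f(x) - kx, \qquad \text{so that} \qquad g'(a) = f'(a) - k > 0, \quad g'(b) = f'(b) - k < 0,$$

and note that a zero of $g'$ in $(a,b)$ is precisely a point where $f' = k$. The case $f'(a) < k < f'(b)$ follows by replacing $f$ with $-f$.)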
Here's the idea: an interior point $c$ with $f'(c) = 0$ is a stationary point of the curve (and conversely!). In particular the derivative will be zero at any interior maximum or minimum of $f$ on $[a,b]$. Recall that since $f$ is differentiable, it is continuous and therefore attains both a maximum and a minimum value on $[a,b]$. So we're set unless both the maximum and the minimum are attained at the endpoints. Perhaps the sign conditions on $f'$ at the endpoints have something to do with this...
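(If you would rather not be left entirely on your own, here is one way to make that last hint precise at the left endpoint:

$$f'(a) = \lim_{x \to a^+} \frac{f(x) - f(a)}{x - a} > 0 \quad \Longrightarrow \quad f(x) > f(a) \ \text{ for all } x \in (a, a + \delta), \ \text{ for some } \delta > 0,$$

so the maximum of $f$ on $[a,b]$ cannot be attained at $a$. I'll leave the right endpoint, and the conclusion, to you.)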