I remember being assigned the following homework problem a few years back.
Let $f:[0,1] \to \mathbb{R}$ be continuously differentiable. Prove that, for every $\epsilon > 0$, there exists $\delta > 0$ such that $0 < |h| < \delta$ implies $\left| \frac{f(x+h) - f(x)}{h} - f'(x) \right| < \epsilon$ for all $x$ with $x, x+h \in [0,1]$.
I also remember how I solved it.
Fix $\epsilon > 0$. Since $f'$ is continuous on the compact interval $[0,1]$, it is uniformly continuous, so we can find a $\delta > 0$ such that $|x-y| < \delta$ implies $|f'(x) - f'(y)| < \epsilon$ for all $x, y \in [0,1]$. Suppose $0 < |h| < \delta$ and that $x$ is such that $x, x+h \in [0,1]$. By the Mean Value Theorem, there is an $a$ strictly between $x$ and $x+h$ such that $f'(a) = \frac{f(x+h) - f(x)}{h}$. Since $a$ lies strictly between $x$ and $x+h$, we have $|x - a| < |h| < \delta$. So $\left|\frac{f(x+h) - f(x)}{h} - f'(x) \right| = |f'(a) - f'(x)| < \epsilon$, and we are finished.
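To spell out why this is a uniform statement: writing $a = a(x,h)$ for the point produced by the MVT, the whole argument is the chain
$$0 < |h| < \delta \;\Longrightarrow\; |a - x| < \delta \;\Longrightarrow\; \left|\frac{f(x+h) - f(x)}{h} - f'(x)\right| = |f'(a) - f'(x)| < \epsilon,$$
where $\delta$ depends only on $\epsilon$ (through the uniform continuity of $f'$), not on $x$.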
What had me baffled was this: we had not covered the MVT at the time the problem was assigned! This suggests there should be a way to prove it without using the MVT. Can anybody think of a way? I don't think I ever did. Thanks.