
How do I prove this? We don't have that strictly increasing implies $f' \gt 0$, but we do have the converse; we also have Rolle's Theorem, statements of the Mean Value Theorem, and the fact that if $f'$ exists then $f$ is continuous:

Suppose $f'$ exists on $(a, b)$ and $f'(x) \neq 0$ on $(a, b)$. Prove that either $f'(x) \gt 0$ for all $x \in (a, b)$ or $f'(x) \lt 0$ for all $x \in (a, b)$.

My attempts don't seem rigorous.


3 Answers


I'm assuming the conclusion is supposed to be that $f'(x)$ is either always positive or always negative (rather than $f(x)$), since the conclusion is false for $f(x)$: if it were true for a function $f$, then a suitable vertical translation would give a function that is sometimes positive and sometimes negative on $(a,b)$, without changing the derivative.
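To see why, here is a minimal example (mine, not the original poster's): on $(0,1)$, the function $f(x) = x - \tfrac{1}{2}$ has $f'(x) = 1 \gt 0$ everywhere, yet $f$ is negative on $(0,\tfrac{1}{2})$ and positive on $(\tfrac{1}{2},1)$.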

Assume there are points $d,e$ with $a\lt d\lt e\lt b$ and $f'(d)\lt 0 \lt f'(e)$.

By the Extreme Value Theorem, $f$ (being differentiable, hence continuous) achieves a maximum and a minimum on $[d,e]$, and the minimum is achieved either at a critical point or at an endpoint. By hypothesis there are no critical points, so the minimum must be achieved at $d$ or at $e$.

Show that the fact that $f'(d)\lt 0$ implies that $f$ cannot achieve its minimum over $[d,e]$ at $d$. One can do this by using the definition of the derivative as a limit.

Added. Here is how one can prove this without trying to show that $f$ is decreasing on a neighborhood of $d$: by definition, $$\lim_{h\to 0}\frac{f(d+h)-f(d)}{h} = f'(d)\lt 0.$$ In particular, $$\lim_{h\to 0^+}\frac{f(d+h)-f(d)}{h} = f'(d)\lt 0.$$ By definition of the limit, for every $\epsilon\gt 0$ there exists $\delta\gt 0$ such that if $0\lt h\lt \delta$, then $$\left|\frac{f(d+h)-f(d)}{h} - f'(d)\right|\lt \epsilon.$$ Take $\epsilon = |f'(d)|/2$, and let $\delta_0$ be the corresponding $\delta$. Then for any $h$ with $0\lt h\lt \delta_0$ we have $$\left|\frac{f(d+h)-f(d)}{h} - f'(d)\right| \lt \frac{|f'(d)|}{2}.$$ From this, we conclude that $\frac{f(d+h)-f(d)}{h}\lt 0$. Since $h\gt 0$, that means that $f(d+h)-f(d)\lt 0$, or $f(d+h)\lt f(d)$. This holds for all $h$ with $0\lt h\lt \delta_0$. That means that $f(d)$ is strictly larger than all values of $f$ on $(d,d+\delta_0)$, so $f(d)$ cannot be the minimum on $[d,d+\delta_0)$, and hence cannot be the minimum on $[d,e]$.
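To make the last step explicit (this just unwinds the inequality above, using $|f'(d)| = -f'(d)$ since $f'(d)\lt 0$): $$\frac{f(d+h)-f(d)}{h} \lt f'(d) + \frac{|f'(d)|}{2} = f'(d) - \frac{f'(d)}{2} = \frac{f'(d)}{2}\lt 0.$$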

Likewise, show that the fact that $f'(e)\gt 0$ implies that $f$ cannot achieve its minimum over $[d,e]$ at $e$. This contradiction arises from the assumption that we can find points $d$ and $e$ with $f'(d)\lt 0 \lt f'(e)$. Replace $f$ with $-f$ to show you cannot find points $d$ and $e$ with $f'(d)\gt 0 \gt f'(e)$ either.

  • @totok: "Defined on a neighborhood of $d$" does not imply continuity. And we don't need to *assume* it's defined on a neighborhood of $d$. **We are told** that $f$ is differentiable on all of $(a,b)$. So it is *certainly* defined on a neighborhood of $d$: namely, $(a,b)$. *But you don't need that* to conclude that $f(d)$ cannot be a minimum: just the derivative being negative at that point is more than enough.

As ever, Arturo's response is right on the money. But since there seem to be some doubts / lack of approval for Arturo's answer, let me offer a second (totally corroborating) opinion.

The theorem in question is called Darboux's Theorem, or sometimes "The Intermediate Value Theorem for Derivatives". It states that for any differentiable function $f$, $f'$ has the intermediate value property, even though it need not be continuous. This result, although simple to prove, is rather surprising.
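A standard example illustrating this (added here for concreteness; it is not part of the original answer): $$f(x) = \begin{cases} x^2\sin(1/x) & x\neq 0,\\ 0 & x = 0,\end{cases}\qquad f'(x) = \begin{cases} 2x\sin(1/x) - \cos(1/x) & x\neq 0,\\ 0 & x = 0.\end{cases}$$ Here $f$ is differentiable everywhere, but $f'$ is not continuous at $0$ (the $\cos(1/x)$ term oscillates); Darboux's Theorem nonetheless guarantees that $f'$ takes every value between any two of its values.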

As it happens, I proved this result in my "Spivak calculus" class last week, so the statement and proof can be found on page 20 of these notes. The proof uses the Extreme Value Theorem, and in fact it differs in no essential way from the argument Arturo gives in his answer. This is a standard thing...


Use the intermediate value property for derivatives (Darboux's Theorem). If $f'$ is negative at one point and positive at another, then at some point in between $f'$ takes the value $0$, which contradicts the hypothesis.
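Spelled out: if there were points $d\lt e$ in $(a,b)$ with $f'(d)\lt 0\lt f'(e)$, then since $0$ lies strictly between $f'(d)$ and $f'(e)$, Darboux's Theorem gives some $c\in(d,e)$ with $f'(c) = 0$, contradicting $f'(x)\neq 0$ on $(a,b)$.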
