9

Assume that $f:{\bf R}\to{\bf R}$ is differentiable on ${\bf R}$, and that both $\lim\limits_{x\to\infty}f(x)$ and $\lim\limits_{x\to\infty}f'(x)$ are finite. Geometrically, one expects that $\lim\limits_{x\to\infty}f'(x)=0$. Here is my question:

How can one actually prove it?

By definition, it suffices to show that

$\lim_{x\to\infty}\lim_{h\to 0}\frac{f(x+h)-f(x)}{h}=0$

i.e. $\forall \epsilon>0~\exists M>0\quad \textrm{s.t.}\quad x>M\Rightarrow \left|\lim_{h\to 0}\frac{f(x+h)-f(x)}{h}\right|<\epsilon$. Since $\lim\limits_{x\to\infty}f(x)$ is finite, for large enough $M$ one has $|f(x+h)-f(x)|<\tilde{\epsilon}$ whenever $x>M$. But I have no idea how to go on.
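(A standard counterexample shows why assuming that $\lim\limits_{x\to\infty}f'(x)$ exists is essential: for
$$f(x)=\frac{\sin(x^2)}{x},\qquad f'(x)=2\cos(x^2)-\frac{\sin(x^2)}{x^2},$$
one has $f(x)\to 0$ while $f'(x)$ oscillates and has no limit as $x\to\infty$.)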

  • 2
    This is a duplicate of http://math.stackexchange.com/questions/42277/limit-of-the-derivative-of-a-function-as-x-goes-to-infinity/42298#42298 (2011-06-08)

3 Answers

11

You are assuming that the limit of the derivative exists. Say $\lim\limits_{x\to\infty}f'(x) = L$.

Case 1. $L\gt 0$. Then there exists $M\gt 0$ such that for all $x\geq M$, $|f'(x) - L|\lt \frac{L}{2}$. Therefore, for all $x\geq M$, $\frac{L}{2} \lt f'(x) \lt \frac{3L}{2}$. In particular, $f$ is increasing on $[M,\infty)$.

By the Mean Value Theorem, for each natural number $n$ there exists a $c$ (which depends on $n$), $M+n \lt c\lt M+n+1$, such that $$\frac{f(M+n+1)-f(M+n)}{(M+n+1)-(M+n)} = f'(c) \geq \frac{L}{2},$$ that is, $f(M+n+1)-f(M+n)\geq \frac{L}{2}$. Inductively, we conclude that $f(M+n)\geq f(M)+\frac{Ln}{2}$. But as $n\to\infty$, $f(M)+\frac{Ln}{2}\to \infty$; hence $f(x)\to\infty$ as $x\to\infty$ (recall $f$ is increasing on $[M,\infty)$), contradicting our assumption that $\lim\limits_{x\to\infty}f(x)$ is finite.
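To make the inductive step explicit, the MVT estimates telescope:
$$f(M+n)-f(M)=\sum_{k=0}^{n-1}\bigl(f(M+k+1)-f(M+k)\bigr)\geq \sum_{k=0}^{n-1}\frac{L}{2}=\frac{Ln}{2}.$$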

Case 2. $L\lt 0$; a similar argument shows that $f(x)\to-\infty$ as $x\to\infty$ in this case.
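Explicitly, one can apply Case 1 to $g=-f$: since
$$\lim_{x\to\infty}g'(x)=-L\gt 0,$$
Case 1 gives $g(x)\to\infty$, i.e. $f(x)\to-\infty$, again contradicting the finiteness of $\lim\limits_{x\to\infty}f(x)$.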

Therefore, the only possibility left is $L=0$, as desired.

2

The trick to getting geometric information out of the derivative is the mean value theorem. A corollary of the MVT is that, if $m\leq f'(x)\leq M$ for all $x\in [a,b]$, then $m(b-a)\leq f(b)-f(a) \leq M(b-a)$.
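Indeed, the MVT provides a $c\in(a,b)$ with $f(b)-f(a)=f'(c)(b-a)$, and bounding $f'(c)$ between $m$ and $M$ gives
$$m(b-a)\leq f'(c)(b-a)=f(b)-f(a)\leq M(b-a).$$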

So let's proceed to the proof of the problem.

By subtracting off a constant, we can assume $\lim f(x)=0$. Let $\lim f'(x)=L$, and assume $L\neq 0$; without loss of generality, $L>0$. Then for $N$ large enough, $x>N$ implies that $|f(x)|<1$ and $f'(x)>L/2$.

If $N<a<b$ with $b-a>4/L$, then the corollary above gives $f(b)-f(a) > \frac{L}{2}(b-a)> \frac{L}{2}\cdot\frac{4}{L}=2$, which is impossible, since $|f(a)|<1$ and $|f(b)|<1$ force $|f(b)-f(a)|<2$.

0

This is a relatively old post, but the subject is interesting, so I thought I would post another solution, also based on the mean value theorem. Here we go:

Since $f(x)$ is a primitive of $f'(x)$, the fundamental theorem of calculus (assuming $f'$ is continuous, so that the integral below makes sense) gives $\int_{M}^{M+r}f'(x)\,dx=f(M+r)-f(M)$ for all $M\in \mathbb{R}$ and all $r>0$.

Also, by the mean value theorem for integrals, there exists $\xi (M)\in \left [ M,M+r \right ]$ such that $\int_{M}^{M+r}f'(x)\,dx=f'(\xi (M))\left ( (M+r)-M \right )=f'(\xi (M))\cdot r$.

So $f(M+r)-f(M)=f'(\xi (M))\cdot r$.

Now, if $M \to \infty $ then $\xi (M) \to \infty $, while $r$ is constant. As a result, $0=\lim\limits_{M \to \infty }\left [f(M+r)-f(M) \right ]=\lim\limits_{M \to \infty }f'(\xi (M))\cdot r$, and hence $\lim\limits_{M \to \infty }f'(\xi (M))=0$.

Because $\lim\limits_{x \to \infty }f'(x)$ exists and is finite, it must then be $0$.
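Spelled out: if $L=\lim\limits_{x \to \infty }f'(x)$, then $f'(x_n)\to L$ for every sequence $x_n\to\infty$; taking $r=1$ and $x_n=\xi(n)\in[n,n+1]$ gives
$$L=\lim_{n\to\infty}f'(\xi(n))=0.$$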