13

Given that $f$ is differentiable, what does $\lim\limits_{x \to \infty} f(x) = 1$ say about $\lim\limits_{x \to \infty} f^\prime(x)$? Intuitively I feel that it's $0$.

I attempted to solve this by evaluating $$\lim_{x \to \infty} \lim_{h \to 0} \frac{f(x + h) - f(x)}{h}$$ and interchanging the limits after showing that $\frac{f(x + h) - f(x)}{h}$ converges uniformly as $x \to \infty$. But I couldn't proceed further.

Trying to work backwards, the specific example $f(x) = \arctan(x)$ came to my mind. Its derivative certainly goes to $0$ as $x \to \infty$. Doesn't this show that $\frac{f(x + h) - f(x)}{h}$ converges uniformly to $0$ as $x \to \infty$?

I'm totally confused! I would really appreciate it if anyone could tell me what I am doing wrong.

  • 0
    To interchange the limits you need uniform convergence, and I see that you are interchanging them in order to prove the uniform convergence... I am not clear on your proof. (2011-04-29)
  • 0
    @El I couldn't see how I could show uniform convergence, so I tried to work backwards from a specific example to see whether uniform convergence would come up. (2011-04-29)

2 Answers

7

This is in response to Balaji's comment.

Lemma. Let $f\colon (a,+\infty)\to\mathbb{R}$ be a bounded $C^1$ function such that $\lim_{x\to+\infty}f'(x)=\delta$ exists. Then $\delta=0$.

Proof. We proceed by contradiction and assume that $\delta\ne0$. Suppose first that $\delta>0$. Then there exists $x_0>a$ such that $f'(x)\ge\delta/2$ for all $x\ge x_0$, so $$ f(x)=f(x_0)+\int_{x_0}^x f'(t)\,dt\ge f(x_0)+\frac{\delta}{2}(x-x_0)\quad\forall x\ge x_0. $$ In particular, $f$ is unbounded, contradicting the hypothesis. The case $\delta<0$ is handled similarly.

If you do not want to use integrals, you may reason as follows. Let $$g(x)=f(x)-f(x_0)-\frac{\delta}{2}(x-x_0).$$ Then $g(x_0)=0$ and $g'(x)\ge 0$ for all $x>x_0$. It follows that $g$ is non-decreasing on $(x_0,+\infty)$, so $g(x)\ge 0$ for all $x>x_0$; that is, $f(x)\ge f(x_0)+\frac{\delta}{2}(x-x_0)$, and again $f$ is unbounded, contradicting the hypothesis.
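
As a quick sanity check of the lemma, one can verify both the hypotheses and the conclusion numerically for $\arctan$, the bounded $C^1$ function from the question. The SymPy sketch below is illustrative only; the lemma itself is what the proof above establishes.

```python
import sympy as sp

x = sp.symbols('x', positive=True)
f = sp.atan(x)  # a bounded C^1 function on (0, +oo)

print(sp.limit(f, x, sp.oo))              # pi/2, so f is bounded near +oo
print(sp.limit(sp.diff(f, x), x, sp.oo))  # 0, as the lemma predicts
```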

24

Nothing. You can take something like $x\mapsto \frac{\sin(x^2)}{x}+1$.
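
Differentiating by hand, $f'(x) = 2\cos(x^2) - \frac{\sin(x^2)}{x^2}$: the second term vanishes as $x\to\infty$, but the first keeps sweeping through $[-2,2]$. A short NumPy sketch (illustrative only) makes this visible:

```python
import numpy as np

def f(x):
    return np.sin(x**2) / x + 1.0

def fprime(x):
    # hand-computed derivative: f'(x) = 2 cos(x^2) - sin(x^2) / x^2
    return 2.0 * np.cos(x**2) - np.sin(x**2) / x**2

xs = np.linspace(1000.0, 1000.01, 8)  # x^2 sweeps through ~20 radians here
print(f(xs))       # every value within about 1e-3 of 1
print(fprime(xs))  # still oscillating across roughly [-2, 2]
```

So $f(x)\to 1$ while $f'(x)$ has no limit.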

  • 13
    This is a correct example; however, perhaps it is not so obvious how he arrived at this result. The idea is that, as $x$ becomes larger and larger, you can force the function to lie within a very small band around $y=1$, yet you still have the ability to make the function jump up and down from $1-\varepsilon$ to $1+\varepsilon$. The amplitude of the oscillations becomes smaller and smaller, so the function converges (in this case to $1$), but the oscillation never goes away, so the derivative does not converge. Hope this helps with the intuition here! (2011-04-29)
  • 4
    The counter-example can be found intuitively by thinking of a sinusoid that, as $x$ increases, decreases its amplitude as well as its period, both by the same amount: take `f(x) = g(x) sin(x / g(x))` with `g(x)` decreasing to zero, in particular `g(x) = 1/x`. Then `f(x)` tends to zero, but (because it is always a scaled-down version of itself) its derivative keeps oscillating. (2011-04-29)
  • 0
    @GleasSpty, @leonbloy, Thanks! It seems simple in retrospect. But I got so tied up with trying to use uniform convergence that I couldn't see anything else. (2011-04-29)
  • 7
    It should be mentioned that _if_ $\lim_{x\to\infty}f'(x)$ exists, then the limit must be $0$. (2011-04-29)
  • 0
    @Julián Thanks! After seeing the answers above I had reached this conclusion, but was waiting for someone to say it. But how can one show it? If you can make an answer out of it, I'll be very glad. (2011-04-30)