
While doing some exercises on continuity and differentiability, I wondered whether there is a connection between a function's limits at $\pm \infty$ and the limits of its derivative. To be more formal: suppose we have a real-valued, continuous, $n$-times differentiable function $f$. If we know the limits of $f$ at $\pm \infty$, can we say something about the limits at $\pm \infty$ of its derivative $f'$ in the following cases:

1) The limit is finite say $l \in \mathbb R$;

2) The limit is infinite (I guess the sign doesn't matter in this case);

3) We additionally know that the function is convex/concave and we are in one of the two cases already mentioned.

Any example/counterexample or source is greatly appreciated.

3 Answers


Let $g(x)=(1-x^2)^2$ if $x\in[-1,1]$ and $0$ for $|x|>1$. Show that $g(x)$ is differentiable and, in fact, the derivative is continuous. $g(x)$ can be seen as a differentiable "bump" that starts rising at $x=-1$, reaching a peak value of $1$ at $x=0$, and returning to $0$ at $x=1$.
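As a quick numerical sanity check (a sketch of my own, not part of the argument), one can verify with NumPy that the finite-difference derivative of $g$ is continuous across the glue point $x=1$: inside $[-1,1]$ we have $g'(x)=-4x(1-x^2)$, which tends to $0$ as $x\to 1$, matching the zero derivative of the outside piece.

```python
import numpy as np

def g(x):
    """The bump: (1 - x^2)^2 on [-1, 1], zero outside."""
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) <= 1.0, (1.0 - x**2)**2, 0.0)

# Central finite differences approaching the glue point x = 1;
# the values shrink toward 0, consistent with a continuous derivative.
h = 1e-6
for x0 in (0.9, 0.999, 1.0, 1.001):
    d = (g(x0 + h) - g(x0 - h)) / (2 * h)
    print(f"g'({x0}) ~ {d:.6f}")
```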

Now, define $f(x)$ as:

$$f(x)=\sum_{k=1}^{\infty} \frac{1}{k}g(k^2(x-k))$$

$f(x)$ can be seen as a function which is zero except for bumps of height $1/k$ on the intervals $(k-1/k^2,\,k+1/k^2)$, where $k$ is a positive integer.

Claim: $\lim_{x\to\infty} f(x) = 0$ and $\lim_{x\to\infty} f'(x)$ does not exist.

Proof: First, $f(k)=\frac{1}{k}$ for all positive integers $k$, and if $x>k$, then $0\leq f(x)\leq\frac{1}{k}$. This shows that $f(x)\to 0$ as $x\to\infty$.

Also, $f(k-1/k^2)=0$ for $k\geq 3$ (the neighboring bumps do not reach that point). By the mean value theorem, there is then some $c\in (k-1/k^2,k)$ with $f'(c)=\frac{f(k)-f(k-1/k^2)}{1/k^2} = k$. This shows that $f'(x)$ does not converge to zero; in fact, it is not even bounded.
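The claim can be seen numerically as well (a sketch using NumPy; truncating the sum is harmless at the sample points below, since distant bumps vanish there):

```python
import numpy as np

def g(x):
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) <= 1.0, (1.0 - x**2)**2, 0.0)

def f(x, K=60):
    # Truncation of f(x) = sum_{k>=1} g(k^2 (x - k)) / k.
    return sum(g(k**2 * (x - k)) / k for k in range(1, K + 1))

eps = 1e-6
for k in (5, 10, 20, 40):
    # f is small at the bump's peak (f(k) = 1/k), but the slope on
    # the bump grows roughly linearly in k, so f' is unbounded.
    xs = np.linspace(k - 1.0 / k**2, k + 1.0 / k**2, 2001)
    slope = (f(xs + eps) - f(xs - eps)) / (2 * eps)
    print(k, float(f(float(k))), float(np.max(np.abs(slope))))
```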

However, if $f(x)$ converges to a finite value, and $f'(x)$ converges at all, then $f'(x)$ converges to zero. You can prove this again with the mean value theorem. See the proof later in the answer.

If $f(x)$ is concave or convex, then $f'(x)$ is monotonic. That means that if $f'(x)$ does not converge to a finite value, then for some $N$ we have $f'(x)>1$ for all $x>N$, or $f'(x)<-1$ for all $x>N$. This is impossible if $f(x)\to L$, since by the mean value theorem it would force $|f(x+1)-f(x)|>1$ for all $x>N$.

This means that $f'(x)$ converges, and thus it must converge to $0$.


Originally, I wrote that it was true if $f(x)$ was monotonic, but I don't think that's so.

If we define $h(x)=\sum_{k=1}^{\infty} g(k^3(x-k))$ and $f(x)=\int_0^x h(t)\,dt$, then $f'(x)=h(x)$ does not converge to zero, but $f(x)$ is monotonically increasing and bounded by $2\sum_k \frac{1}{k^3}$ (each bump of $h$ has height $1$ and width $2/k^3$), so $f(x)$ converges while $f'(x)$ does not.

We could even make $h(x)$ unbounded by making it $h(x)=\sum_{k=1}^{\infty} kg(k^4(x-k))$.

If you want a counterexample with $f$ strictly increasing, use $1-e^{-x} + \int_{0}^x h(t)\,dt$.
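Under the same definitions, here is a quick numerical check (my own sketch) that $h$ keeps returning to $1$ while the total area under it, i.e. $\lim_{x\to\infty} f(x)$, stays finite:

```python
import numpy as np

def g(x):
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) <= 1.0, (1.0 - x**2)**2, 0.0)

def h(x, K=200):
    # Truncation of h(x) = sum_{k>=1} g(k^3 (x - k)).
    return sum(g(k**3 * (x - k)) for k in range(1, K + 1))

# h equals 1 at every large integer (so h does not converge),
# but vanishes between the bumps:
print([float(h(k)) for k in (3, 10, 50)], float(h(10.5)))

# The bump at k has height 1 and width 2/k^3; since the integral of g
# over [-1, 1] is 16/15, the k-th bump contributes (16/15)/k^3 of area,
# so f(x) = int_0^x h is increasing and bounded.
bound = (16.0 / 15.0) * sum(1.0 / k**3 for k in range(1, 100001))
print(bound)  # roughly 1.28
```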


One last thing I didn't prove: If $f'(x)$ and $f(x)$ both converge to finite values, then $f'(x)$ converges to $0$.

The key is the result:

Lemma: If $f$ is differentiable and $\lim_{x\to\infty} f(x)=L$, then $\liminf_{x\to\infty} |f'(x)|=0$.

Proof:

This follows from the mean value theorem.

Since $f(x)\to L$, there is an $N$ so that $|f(x)-L|<1/2$ for $x>N$. Thus, for $x>N$ and any $D>0$, $|f(x+D)-f(x)|<1$.

Now, given any $\epsilon>0$ and any $M$ pick $x>\max(M,N)$ and set $D=\frac{1}{\epsilon}$.

By the mean value theorem, there must be a $c\in[x,x+D]$ such that $$f'(c)=\frac{f(x+D)-f(x)}{D}=\epsilon(f(x+1/\epsilon)-f(x))$$ and thus $|f'(c)|<\epsilon$.

So $\liminf_{x\to\infty} |f'(x)|=0$.

Corollary:

If $f(x)$ and $f'(x)$ converge to finite values as $x\to\infty$, then $\lim_{x\to\infty} f'(x)=0$.

Proof: Since $f'(x)$ converges, $\liminf_{x\to\infty} |f'(x)| = \lim_{x\to\infty} |f'(x)| = |\lim_{x\to\infty} f'(x)|$. By the lemma, the left-hand side is $0$, so $\lim_{x\to\infty} f'(x)=0$.


Finally, if $f(x)\to\infty$ you can get any value for $\alpha=\liminf |f'(x)|$. For example, $f(x)=\alpha x$ for $\alpha\neq 0$ gives $f'(x)=\alpha$, and $$f(x)=\begin{cases}\log x&x\geq1\\ x-1&x<1\end{cases}$$

has $f(x)\to\infty$ as $x\to\infty$ but $f'(x)\to 0$.

For concave $f$, the limit $\lim f'(x)$ always exists in $[-\infty,\infty]$ (since $f'$ is monotonic), but without concavity you can get any values $\alpha\leq\beta$ for $\alpha=\liminf f'(x)$ and $\beta=\limsup f'(x)$.

  • I hope I don't face such functions in the future. (2017-01-03)
  • Yeah, analysis counterexamples are really unpleasant. (2017-01-03)
  • What else can I say? Thank you a lot! Very interesting answer, helped me clear my mind a lot! (2017-01-03)

Generally, no. Two standard counterexamples:

  1. $\sin(x^2)/x$: the function tends to $0$, but its derivative $2\cos(x^2)-\sin(x^2)/x^2$ oscillates forever and has no limit.

  2. $x + \sin(x)$: the function tends to $\infty$, but its derivative $1+\cos(x)$ oscillates between $0$ and $2$.

The graphs of these make this obvious.
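A numerical look at both examples (a sketch; the derivative formulas below come from routine differentiation):

```python
import numpy as np

x = np.linspace(100.0, 110.0, 100001)

# Example 1: f(x) = sin(x^2)/x tends to 0, but its derivative
# f'(x) = 2 cos(x^2) - sin(x^2)/x^2 keeps sweeping through [-2, 2].
f1 = np.sin(x**2) / x
d1 = 2 * np.cos(x**2) - np.sin(x**2) / x**2
print(np.max(np.abs(f1)), np.min(d1), np.max(d1))

# Example 2: f(x) = x + sin(x) tends to infinity, but its derivative
# f'(x) = 1 + cos(x) keeps oscillating between 0 and 2.
d2 = 1 + np.cos(x)
print(np.min(d2), np.max(d2))
```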


$1)$ Consider a sine wave with decreasing amplitude and increasing frequency, such as $\sin(x^2)/x$; also $\arctan x$, whose derivative $\frac{1}{1+x^2}$ tends to $0$. So nothing can be said in general about the limit of the derivative. $2)$ Consider $x^x$ and the identity function: both tend to $\infty$, but the first has unbounded derivative while the second has derivative $1$. Again, nothing can be said about the limit of the derivative.

I have no idea about the third case.