5

I conjecture that, under some specific conditions, a differentiable function satisfies the equality $\lim_{n\to\infty} \bigl(f(n+1) - f(n)\bigr) = \lim_{x\to\infty} f'(x)$.

However, I'm not sure yet exactly what those conditions are, so I don't know precisely where I may apply this rule and where I may not. If you take a look at the problem I posted here, you'll immediately notice that the rule applies in that case. I would really appreciate your help in clarifying this.
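For instance, taking $f(x) = \ln x$ purely as an illustration (this is not the linked problem), both sides come out to $0$:

$$\lim_{n\to\infty} \bigl(\ln(n+1) - \ln n\bigr) = \lim_{n\to\infty} \ln\!\left(1 + \frac{1}{n}\right) = 0, \qquad \lim_{x\to\infty} (\ln x)' = \lim_{x\to\infty} \frac{1}{x} = 0.$$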

  • 0
    I think that this rule has good potential to help when dealing with some limits. 2012-08-05

1 Answer

5

I believe that the minimal assumptions are:

  1. $f \colon [0,+\infty) \to \mathbb{R}$ is differentiable;
  2. $\lim_{x \to +\infty} f'(x)$ exists.

Then one easily checks that $\lim_{n \to +\infty} \bigl(f(n+1)-f(n)\bigr)=\lim_{x \to +\infty} f'(x)$, since one can apply Lagrange's mean value theorem: $f(n+1)-f(n)=f'(\xi_n)$ for a suitable $\xi_n \in (n,n+1)$.
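Written out as a short sketch, with $L$ denoting the assumed value of $\lim_{x \to +\infty} f'(x)$:

$$f(n+1)-f(n)=f'(\xi_n), \qquad n<\xi_n<n+1 \;\Longrightarrow\; \xi_n \to +\infty \text{ as } n \to \infty,$$

$$\text{hence}\quad \lim_{n \to +\infty}\bigl(f(n+1)-f(n)\bigr)=\lim_{n \to +\infty} f'(\xi_n)=L.$$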

I do not think the two limits are equivalent, however: the existence of $\lim_{n\to\infty} \bigl(f(n+1)-f(n)\bigr)$ need not imply the existence of $\lim_{x\to\infty} f'(x)$, since it may be impossible to control $f'$ using only the values of $f$ at discrete points.

  • 2
    Yes, the existence of $\lim_{n\rightarrow \infty} \bigl(f(n+1)-f(n)\bigr)$ does not imply the existence of $\lim_{x\rightarrow \infty} f'(x)$, as can be seen with $f(x) = \sin(\pi x)$. 2012-08-05
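Spelling out that example: for $f(x) = \sin(\pi x)$,

$$f(n+1)-f(n)=\sin\bigl(\pi(n+1)\bigr)-\sin(\pi n)=0-0=0 \quad \text{for every integer } n,$$

so $\lim_{n\to\infty}\bigl(f(n+1)-f(n)\bigr)=0$, while $f'(x)=\pi\cos(\pi x)$ oscillates between $-\pi$ and $\pi$ and has no limit as $x\to\infty$.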