
Let $f: \mathbb{R} \to \mathbb{R}$ be (at least) differentiable and let $x > 0$. For any $\epsilon \in (0, x)$, the Mean Value Theorem furnishes a point $c_\epsilon \in (\epsilon, x)$ such that $$f(x) - f(\epsilon) = f'(c_\epsilon)(x - \epsilon).$$ For the same reason, there is also a point $c \in (0,x)$ such that $$f(x) - f(0) = f'(c)(x - 0) = f'(c)x.$$

I have the following questions:

i) Under what conditions is it true that $$\lim_{\epsilon \to 0} \, \bigl(f'(c) - f'(c_\epsilon)\bigr) = 0\,?$$

ii) Can anything at all be said about the limit $$\lim_{\epsilon \to 0} \frac{f'(c) - f'(c_\epsilon)}{\epsilon}\,?$$

iii) In particular, is there a non-trivial class of functions such that $$\lim_{\epsilon \to 0} \frac{f'(c) - f'(c_\epsilon)}{\epsilon} = 0\,?$$
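For a concrete feel for what I am asking (a simple example I worked out by hand, where the MVT points are unique), take $f(t) = t^2$: then $c = x/2$ and $c_\epsilon = (x+\epsilon)/2$, so $$f'(c) - f'(c_\epsilon) = x - (x + \epsilon) = -\epsilon \qquad\text{and}\qquad \frac{f'(c) - f'(c_\epsilon)}{\epsilon} = -1,$$ so the limit in i) is $0$, while the limit in ii) exists but is not $0$.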

Feel free to impose any additional conditions on the function $f$. Of course, I realize that question ii) is very vague; I am not really expecting an answer to that one. However, if you happen to know of any results related to it, I would be very happy if you shared them.

Thanks in advance.

EDIT: The original question doesn't quite make sense as stated. There could of course be several points $t \in (\epsilon, x)$ such that $$f(x) - f(\epsilon) = f'(t)(x - \epsilon) =: L(x - \epsilon).$$ However, if $f \in C^1(\mathbb{R})$ then $f'^{-1}(L)$ is closed (it is the preimage of a closed set under the continuous map $f'$), so its intersection with $[\epsilon, x]$ is compact. So define $c_\epsilon$ to be the smallest such $t$, and do similarly for $c$. Of course, it would also be interesting if anything can be said in the case where $c_\epsilon \in f'^{-1}(L)$ is an arbitrary choice.
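(To see that this non-uniqueness really can occur, take for instance $f = \sin$ and $x = 2\pi$: then $f(x) - f(0) = 0$, so $L = 0$, and $f'(t) = \cos t$ vanishes at both $t = \pi/2$ and $t = 3\pi/2$ in $(0, x)$; the convention above would pick $c = \pi/2$.)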

  • @ruakh that is true. I just mentioned it since I explicitly define $c_\epsilon$ and it could theoretically refer to several different points. (2011-11-28)

1 Answer


i) Under all conditions. Consider: $$\begin{array}{rcll} f'(c) & = & \dfrac{f(x)-f(0)}{x} & \text{by the definition of } c \\[2pt] & = & \dfrac{\lim_{\epsilon \to 0^+} f(x) - f\!\left(\lim_{\epsilon \to 0^+}\epsilon\right)}{\lim_{\epsilon \to 0^+} x - \lim_{\epsilon \to 0^+}\epsilon} \\[2pt] & = & \dfrac{\lim_{\epsilon \to 0^+} f(x) - \lim_{\epsilon \to 0^+} f(\epsilon)}{\lim_{\epsilon \to 0^+} x - \lim_{\epsilon \to 0^+}\epsilon} & \text{because } f \text{ is differentiable, hence continuous, at } 0 \\[2pt] & = & \dfrac{\lim_{\epsilon \to 0^+}\bigl(f(x)-f(\epsilon)\bigr)}{\lim_{\epsilon \to 0^+}(x-\epsilon)} \\[2pt] & = & \lim_{\epsilon \to 0^+}\dfrac{f(x)-f(\epsilon)}{x-\epsilon} \\[2pt] & = & \lim_{\epsilon \to 0^+}f'(c_\epsilon) & \text{by the definition of } c_\epsilon \end{array}$$

Therefore $$0 = f'(c) - \lim_{\epsilon \to 0^+}f'(c_\epsilon) = \lim_{\epsilon \to 0^+}f'(c) - \lim_{\epsilon \to 0^+}f'(c_\epsilon) = \lim_{\epsilon \to 0^+}\bigl(f'(c) - f'(c_\epsilon)\bigr).$$
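(As a concrete instance, with $f = \cos$ and $x = \pi/3$: $f'(c_\epsilon) = \frac{\cos(\pi/3)-\cos\epsilon}{\pi/3-\epsilon} \to \frac{\cos(\pi/3)-1}{\pi/3} = -\frac{3}{2\pi} = f'(c)$ as $\epsilon \to 0^+$.)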

ii) $$\begin{array}{rcll} \lim_{\epsilon \to 0} \dfrac{f'(c) - f'(c_\epsilon)}{\epsilon} & = & \lim_{\epsilon \to 0} \dfrac{\frac{f(x)-f(0)}{x} - \frac{f(x)-f(\epsilon)}{x-\epsilon}}{\epsilon} & \text{by the definitions of } c \text{ and } c_\epsilon \\[2pt] & = & \lim_{\epsilon \to 0} \left( \dfrac{f(0)-f(x)}{x (x - \epsilon)} + \dfrac{f(\epsilon)-f(0)}{\epsilon (x - \epsilon)} \right) & \text{by algebra} \\[2pt] & = & \dfrac{f(0)-f(x)}{x^2} + \lim_{\epsilon \to 0} \dfrac{f(\epsilon)-f(0)}{\epsilon x} \\[2pt] & = & -\dfrac{f'(c)}{x} + \dfrac{f'(0)}{x} & \text{because } f'(0) = \lim_{\epsilon \to 0}\tfrac{f(\epsilon)-f(0)}{\epsilon} \\[2pt] & = & \dfrac{f'(0) - f'(c)}{x} \end{array}$$

I'm reasonably confident that the above holds for any function meeting the assumptions. I tested it numerically for a number of "normal" functions (quadratics, sine, cosine, etc.), so at least there's nothing blatantly wrong with the algebra.
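For example (hand arithmetic you can compare against the sample output further down): with $f = \cos$ and $x = \pi/3$, the formula gives $$\frac{f'(0) - f'(c)}{x} = \frac{0 - \frac{\cos(\pi/3)-1}{\pi/3}}{\pi/3} = \frac{3/(2\pi)}{\pi/3} = \frac{9}{2\pi^2} \approx 0.455945,$$ which is exactly the value the numerical test converges to.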

iii) It's clear from the above that this limit will equal zero if and only if $(x, f(x))$ is on the line that's tangent to $f$ at zero. The only way this can be a general property of $f$ is if $f$ is its tangent line at zero, that is, if $f$ is a line. But for other functions it can be true for specific values of $x$; for example, if $f(x) = \cos(x)$, then this limit equals zero whenever $x \in \{ 2n\pi \; | \; n \in \mathbb{N} \}$.
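(A quick check of that cosine claim using the formula from (ii): at $x = 2\pi$ we have $f'(c) = \frac{\cos(2\pi) - \cos(0)}{2\pi} = 0$ and $f'(0) = -\sin(0) = 0$, so $\frac{f'(0) - f'(c)}{x} = 0$, as claimed.)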


Edited to add: I mentioned above that I tested my answer for (ii) numerically for a few different choices of $f$ and $x$. Here is the C code that I used to do that (slightly modified, in an attempt to make it more readable by someone who doesn't know C):

#include <stdio.h>
#include <math.h>

#define PI (4 * atan(1))

double f(double x)
  { return cos(x); }

double f_prime(double x)
  { return -sin(x); }

double f_prime_c(double x)
  { return (f(x) - f(0)) / x; }

double f_prime_c_epsilon(double x, double epsilon)
  { return (f(x) - f(epsilon)) / (x - epsilon); }

int main()
{
  double x = PI / 3;

  // first, print five successive approximations to the limit:
  double epsilon;
  if(x > 1.0)
    epsilon = 1.0;
  else
    epsilon = x;
  for(int i = 0; i < 5; ++i)
  {
    epsilon /= 10;

    double approx =
      (f_prime_c(x) - f_prime_c_epsilon(x, epsilon)) / epsilon;

    printf("%f\t(epsilon=%f)\n", approx, epsilon);
  }

  printf("\n");

  // next, print what we *expect* the limit to be:
  printf("%f\t(epsilon=0; expected)\n", (f_prime(0) - f_prime_c(x)) / x);

  return 0;
}

To compile and run that, assuming you have a Linux-y system available, you would save it as a file named, say, fun_with_mvt.c, and run this command:

gcc -std=c99 -Wall fun_with_mvt.c -o fun_with_mvt && ./fun_with_mvt

And it should give output like this:

0.451338        (epsilon=0.100000)
0.455521        (epsilon=0.010000)
0.455903        (epsilon=0.001000)
0.455941        (epsilon=0.000100)
0.455945        (epsilon=0.000010)

0.455945        (epsilon=0; expected)

To choose a different $f$ (and $f'$), change the lines that read { return cos(x); } and { return -sin(x); }. To choose a different $x$, change the line that reads double x = PI / 3;.
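For example (my own illustrative substitution, not part of the original test): to try $f(x) = x^2$, whose derivative is $2x$, those two function bodies would become

double f(double x)
  { return x * x; }   // f(x) = x^2

double f_prime(double x)
  { return 2 * x; }   // f'(x) = 2x

With that change, the formula from (ii) predicts a limit of $\frac{f'(0) - f'(c)}{x} = \frac{0 - x}{x} = -1$ for every $x > 0$, which is easy to confirm against the program's output.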

  • @PZZ: By the way, as I mentioned, I tested my answer for (ii) numerically. I've now edited my answer to give C code that can be used to perform this test. If you have a C compiler, you can play around with different choices of $f$ and $x$ and see for yourself how the limits look (though I would only trust this approach for choices of $f$ that are well-behaved near zero). (2011-11-29)