5

I am wondering whether the following statement is a canonical theorem in real analysis. Does anybody here know an exact reference for it? It may be a corollary of the Mean Value Theorem, I think. Motivated by the answer to this question, I am curious whether the statement remains true in higher dimensions.

The following statement is a corollary of the Mean Value Theorem.

Let $x_0\in{\bf R}$ and let $U(x_0)$ be a neighborhood of $x_0$.

$f:U(x_0)\to{\bf R}$

is continuous on $U(x_0)$ and differentiable on $U(x_0)\setminus\{x_0\}$. If the limit

$\lim_{x\to x_0;x\in U(x_0)\setminus\{x_0\}}f'(x)$

exists, then $f$ is differentiable at $x_0$ and

$f'(x_0)=\lim_{x\to x_0;x\in U(x_0)\setminus\{x_0\}}f'(x)$

Here is my question:

Is this statement still true in higher dimensions, i.e. for $f:U(x_0)\subset{\mathbb R}^n\to{\mathbb R}^m$?

Since I don't know any generalization of the MVT to higher dimensions, I think one may need to approach this directly from the definition. On the other hand, if the problem can be handled component-wise, it may reduce to the one-dimensional case.
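As a quick numerical sanity check of the one-dimensional statement (a sketch only; the function $f(x)=x|x|$ is chosen purely for illustration):

```python
# Illustration (not a proof): f(x) = x*|x| is continuous everywhere and
# differentiable away from 0, with f'(x) = 2|x|, so lim_{x->0} f'(x) = 0.
# The statement then predicts that f'(0) exists and equals 0.

def f(x):
    return x * abs(x)

def fprime(x):
    # derivative of f away from x = 0
    return 2 * abs(x)

hs = [1e-2, -1e-2, 1e-5, -1e-5]

# values of f'(x) as x -> 0 from both sides ...
derivative_values = [fprime(h) for h in hs]
# ... and the difference quotients (f(h) - f(0)) / h at x_0 = 0
difference_quotients = [(f(h) - f(0)) / h for h in hs]

print(derivative_values)     # both lists shrink toward 0
print(difference_quotients)
```

Both sequences tend to $0$, consistent with $f'(0)=\lim_{x\to 0}f'(x)=0$.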

  • 0
    One last comment, a complement to this problem: let $x_0\in {\mathbb R}$ and let $U(x_0)$ be a neighborhood of $x_0$, and suppose $f:U(x_0) \rightarrow {\mathbb R}$ is continuous on $U(x_0)$ and differentiable on $U(x_0)\backslash \{x_0\}$. If the limit $\lim_{x \to x_0}f'(x)$ doesn't exist, $f$ can still be differentiable at $x_0$.2011-05-21
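The classical counterexample behind this comment, $f(x)=x^2\sin(1/x)$ with $f(0)=0$, can be checked numerically (a sketch only, Python assumed):

```python
import math

# f(x) = x^2 sin(1/x), f(0) = 0: differentiable at 0 with f'(0) = 0,
# yet f'(x) = 2x sin(1/x) - cos(1/x) has no limit as x -> 0.

def f(x):
    return 0.0 if x == 0 else x * x * math.sin(1.0 / x)

def fprime(x):
    # derivative for x != 0
    return 2 * x * math.sin(1.0 / x) - math.cos(1.0 / x)

# difference quotients at 0 tend to 0, so f'(0) = 0 ...
print([(f(h) - f(0)) / h for h in (1e-4, 1e-6, 1e-8)])

# ... but f' oscillates between values near -1 and +1 arbitrarily close to 0
n = 10**6
print(fprime(1.0 / (2 * math.pi * n)))        # close to -1
print(fprime(1.0 / (math.pi * (2 * n + 1))))  # close to +1
```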

3 Answers

3

It also follows from L'Hospital's rule applied to $\frac{f(x)-f(x_0)}{x-x_0}$.

Added (to clarify for those who downvoted the answer)

Just to clarify a little: in this case, using L'Hospital is really the same as implicitly using Patrick's proof.

L'H is proven using Cauchy's Mean Value Theorem (which, by the way, does not require $f$ to be non-constant). But in our case $g(x)=x-x_0$, and hence Cauchy's Mean Value Theorem is exactly the standard Mean Value Theorem.

Last but not least, if you look over Patrick's proof, he basically proves the following using the MVT: if $f(x)$ and $x-x_0$ are continuous on $U$, differentiable on $U\setminus\{ x_0 \}$, and $\lim_{x\to x_0} \frac{f'(x)}{1} =L$, then $\lim_{x\to x_0} \frac{f(x)-f(x_0)}{x-x_0} =L$.

While his proof is easier and much cleaner than invoking L'H, it is just a particular case of L'H... And L'H's theorem does not say we cannot use it in the trivial case; it is just that we usually don't, because there it is overkill. But it is not wrong....
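For the record, the reduction with $g(x)=x-x_0$ can be written out explicitly (just unpacking the statement above): Cauchy's Mean Value Theorem gives a $c$ between $x_0$ and $x$ with
$$\frac{f(x)-f(x_0)}{g(x)-g(x_0)}=\frac{f'(c)}{g'(c)},\qquad g(x)=x-x_0,\ g'(c)=1,$$
which collapses to
$$f(x)-f(x_0)=f'(c)\,(x-x_0),$$
i.e. exactly the standard Mean Value Theorem used in Patrick's proof.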

  • 0
    Upvoted because of the edit. Nice one.2011-06-02
2

It is, in fact, a consequence of the mean value theorem. Supposing your neighborhood contains an open interval centered on $x_0$, call the limit of $f'(c)$ as $c\to x_0$ $L(x_0)$, and take $x$ in this interval. Then there exists $c$ between $x_0$ and $x$ such that $$f(x)-f(x_0) = f'(c)(x-x_0) \quad \Rightarrow \quad \frac{f(x) - f(x_0)}{x-x_0} = f'(c) \to L(x_0)$$ because $c \to x_0$ when $x \to x_0$, and hence $f'(c) \to L(x_0) = f'(x_0)$. There is a way of seeing this in the higher dimensional case; in $\mathbb R^n$, use the same way of thinking, but instead of calling it the mean value theorem, use Taylor's Theorem, which says that there exists a $c \in [x_0, x] \overset{def}{=} \{ y \in \mathbb R^n \, | \, y = \lambda x_0 + (1-\lambda) x, \, \lambda \in [0,1] \}$ such that \begin{align*} f(x) - f(x_0) &= f'(c)(x-x_0) \quad \Longrightarrow \quad \frac{f(x) - f(x_0) - L(x_0)(x-x_0)}{\| x-x_0 \|} \\ &= (f'(c) - L(x_0)) \left( \frac{x-x_0}{\| x-x_0 \| } \right) \to 0. \end{align*} My arguments are a little sketchy; I'm just giving you an idea of what it looks like. You need to watch what happens in higher dimensions when you check that the last term goes to $0$.
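A numerical illustration of the $n>1$ case (a sketch only; the function $f(v)=\|v\|^{3/2}$ on ${\mathbb R}^2$ is chosen just for illustration):

```python
import math

# Illustration for n = 2, m = 1 (not a proof): f(v) = ||v||^(3/2) is
# continuous, differentiable away from 0 with grad f(v) = 1.5*||v||^(-1/2) v,
# so ||grad f(v)|| = 1.5*||v||^(1/2) -> 0 as v -> 0.  The statement above
# then predicts Df(0) = 0 (the zero linear map).

def f(x, y):
    return (x * x + y * y) ** 0.75     # ||v||^(3/2)

def grad_norm(x, y):
    r = math.hypot(x, y)
    return 1.5 * math.sqrt(r)          # ||grad f(v)|| for v != 0

for t in (1e-2, 1e-4, 1e-6):
    v = (t, 2 * t)                     # approach 0 along an arbitrary direction
    r = math.hypot(*v)
    # |f(v) - f(0) - 0| / ||v|| should vanish, as should ||grad f(v)||
    print(abs(f(*v)) / r, grad_norm(*v))
```

Both printed quantities shrink toward $0$ together, as the argument predicts.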

  • 0
    Oh, I see. Actually I was just too lazy to use align at that moment and I was pretty tired so I posted and got to sleep. I'll take care of my TeX a little more next time.2011-05-30
1

It is an essential feature of modern multivariate calculus that it can and should be done denominator-free.

We may assume that $x_0=f(x_0)=\lim_{x\to 0}f'(x)=0$ and have to prove that $f'(0)$ exists and is $=0$, that is to say: for any $\epsilon>0$ there is a $\delta>0$ with $|f(x)|<\epsilon|x|$ as soon as $|x|<\delta$.

For the proof we have to resort to the mean-value theorem for functions $\phi: [a,b]\to{\mathbb R}$, because in this case it is enough that $\phi$ is differentiable in the interior $]a,b[$ and continuous on all of $[a,b]$.

So let an $\epsilon>0$ be given. By assumption there is a $\delta>0$ such that $\|f'(x)\|<\epsilon$ for all $x$ with $0<|x|<\delta$. ${\it Fix}$ such an $x$. If $f(x)=0$ there is nothing to prove. Otherwise put $u:=f(x)/|f(x)|$ and consider the auxiliary function $\phi(t):=u\cdot f(t x)\quad (0\leq t\leq 1)\ .$ Then by the mean value theorem (applied to $\phi$) and the chain rule we get, for some $\tau\in\,]0,1[$, $$|f(x)|=|\phi(1)-\phi(0)|=|\phi'(\tau)|=|u\cdot (f'(\tau x)\,x)|\leq \|f'(\tau x)\|\, |x|< \epsilon|x|\ ,$$ as required.
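A quick numerical illustration of the final estimate (a sketch only; the vector-valued map $f(v)=\|v\|\,v$ on ${\mathbb R}^2$ is chosen so that $\|f'(v)\|\to 0$ as $v\to 0$):

```python
import math

# Illustration: f(v) = ||v|| * v maps R^2 -> R^2, is differentiable away
# from 0, and ||f'(v)|| <= 2||v|| -> 0 as v -> 0.  The proof's estimate
# |f(x)| < eps*|x| then forces f'(0) = 0; here we just watch the ratio
# ||f(v)|| / ||v|| = ||v|| shrink along a ray toward the origin.

def f(x, y):
    r = math.hypot(x, y)
    return (r * x, r * y)

for t in (1e-1, 1e-3, 1e-5):
    v = (t, -3 * t)
    fx, fy = f(*v)
    print(math.hypot(fx, fy) / math.hypot(*v))   # equals ||v||, tending to 0
```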