
Let $f\in C^{2}(\mathbb{R}^{2})$. Suppose that $\nabla f=0$ on a compact set $A\subseteq \mathbb{R}^{2}$. I want to prove that there is a strictly positive constant $\lambda > 0$ such that $|f(x)-f(y)|\leq \lambda\, |x-y|^{2}$ for all $x,y$ in $A$.

What I had in mind is that if the gradient of $f$ is zero on a compact set $A$, then the function $f$ is constant on $A$ and the inequality is obvious.

Any suggestions to solve this problem?

  • The argument has to be more involved than what you write. Think about $A=\{0\}\cup\{1/n;n\in\mathbb N\}$ in dimension $1$.

2 Answers


One of the formulations of Taylor's theorem tells you that for any $x,y$ you can approximate: $f(y) = f(x) + \nabla f(x) \cdot (y-x) + O_x(|x-y|^2)$, where the $O_x$ notation is taken close to $x$. In fact, the constant inside $O_x$ can be bounded by the norm of the second derivative of $f$ on the segment $[x,y]$ (times $1/2$, actually, and that is Taylor again). In particular, if you take $\lambda$ to be the maximal value of $\frac{1}{2}\|f''\|$ on the convex hull $\mathrm{conv}\, A$, you have in full generality: $|f(y) - f(x) - \nabla f(x) \cdot (y-x)| \leq \lambda\, |x-y|^2$. Now the claim follows from $\nabla f(x) = 0$ for $x \in A$.
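
Concretely, one way to write out this bound (just a sketch, using the Lagrange form of the Taylor remainder and writing $H_f$ for the Hessian of $f$): for $x, y \in A$ there is some $\xi$ on the segment $[x,y] \subseteq \mathrm{conv}\, A$ with $$f(y) = f(x) + \nabla f(x)\cdot(y-x) + \tfrac{1}{2}\,(y-x)^{\top} H_f(\xi)\,(y-x),$$ and since $\nabla f(x) = 0$ this gives $$|f(y)-f(x)| \;\leq\; \tfrac{1}{2}\,\|H_f(\xi)\|\,|y-x|^{2} \;\leq\; \Big(\tfrac{1}{2}\max_{z\in\mathrm{conv}\, A}\|H_f(z)\|\Big)\,|y-x|^{2}.$$ The maximum is finite because $f \in C^{2}$ and the convex hull of a compact subset of $\mathbb{R}^{2}$ is again compact.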

Sorry about being a little vague. I'm sure you can find an appropriate version of Taylor's theorem, in a notation you find convenient, with a little searching.


Part of the problem here is that the set $A$ need not have any interior points. Back in the one-dimensional case, think of a cubic polynomial on the real line; generally speaking, $A$ will be a two-point set containing the $x$-values of the local max and the local min.

Anyway, it seems to me that if $f$ is continuous, then in particular $f$ is continuous on the compact set $A$ and therefore achieves a max and a min there, so if $x$ and $y$ are not close together then your inequality is obvious. Now what happens if $x$ and $y$ are close together?
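
To spell out the "not close together" case (a sketch; the threshold $\epsilon > 0$ here is any fixed radius below which the close-together argument will take over): if $|x - y| \geq \epsilon$, then $$|f(x)-f(y)| \;\leq\; \max_{A} f - \min_{A} f \;\leq\; \frac{\max_{A} f - \min_{A} f}{\epsilon^{2}}\,|x-y|^{2},$$ so any $\lambda \geq (\max_{A} f - \min_{A} f)/\epsilon^{2}$ handles such pairs.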

Update: Basically you can just use Feanor's suggestion for the case when $x$ is close to $y$. Here is one way you might think of it: for any $x_0 \in A$, Taylor's theorem tells us that (since $\nabla f(x_0) = 0$) $f(x) = f(x_0) + h^2 g(x)$ for some function $g(x)$ that is bounded in a small neighborhood of radius $\epsilon$ about $x_0$. Here $h = |x - x_0|$ and we assume that $x$ lies in the $\epsilon$-neighborhood of $x_0$. This gives you the result you want for all $x$ that are close to $x_0$, by choosing $\lambda$ to be an upper bound for $|g(x)|$. You can then cover $A$ by a finite number of such balls of radius $\epsilon$. If $x$ and $y$ lie in any one of these balls, note that $y$ is itself a point of $A$, so $\nabla f(y) = 0$ and you can repeat the Taylor argument with $y$ as the base point; the bound on $|g|$ only depends on the second derivatives of $f$ near the ball, so it can be taken uniform there, and you have your estimate. If you happen to pick points $x$ and $y$ that do not lie in a single one of the covering balls, then $|x - y|$ is bounded below by a fixed positive number $\delta$ (a Lebesgue number of the cover), and the argument I mentioned at the top of the page works in this case with $\delta$ in place of $\epsilon$.
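
Putting the two cases together (a sketch; $\lambda_1, \dots, \lambda_N$ for the bounds on the finitely many balls and $\delta$ for the minimum separation in the second case are my labels, not notation from the argument above): the single constant $$\lambda \;=\; \max\left\{\lambda_{1},\dots,\lambda_{N},\; \frac{\max_{A} f - \min_{A} f}{\delta^{2}}\right\}$$ works for every pair $x, y \in A$: the first $N$ entries cover pairs lying in a common ball, and the last entry covers pairs with $|x - y| \geq \delta$.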

  • I updated my answer to address your question.