Part of the problem here is that the set $A$ need not have any `interior' points. Back in the one-dimensional case, think of a cubic polynomial on the real line: generally speaking, $A$ will be a two-point set containing the $x$-values of the local max and the local min.
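For a concrete one-dimensional picture of such a two-point $A$ (my own example, not from the question), take

$$f(t) = t^3 - 3t, \qquad f'(t) = 3t^2 - 3, \qquad A = \{-1, 1\}.$$

Here $f(-1) = 2$ and $f(1) = -2$, so if the inequality you are after has the form $|f(x) - f(y)| \leq \lambda \, |x - y|^2$ for $x, y \in A$, it forces $\lambda \geq |f(1) - f(-1)| / |1 - (-1)|^2 = 1$.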
Anyway, it seems to me that if $f$ is continuous, then in particular it is continuous on $A$; assuming $A$ is compact (which the covering argument below needs anyway), $f$ attains a maximum and a minimum on $A$. So if $x$ and $y$ are not close together, your inequality is immediate. The real question is: what happens when $x$ and $y$ are close together?
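To make the far-apart case quantitative (I'm assuming the inequality you are after has the form $|f(x) - f(y)| \leq \lambda \, |x - y|^2$ for $x, y \in A$): write $M$ and $m$ for the maximum and minimum of $f$ on $A$. If $|x - y| \geq \epsilon$ for some fixed $\epsilon > 0$, then

$$|f(x) - f(y)| \leq M - m \leq \frac{M - m}{\epsilon^2} \, |x - y|^2,$$

so any $\lambda \geq (M - m)/\epsilon^2$ handles all such pairs at once.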
Update: Basically you can just use Feanor's suggestion for the case when $x$ is close to $y$. Here is one way you might think of it: for any $x_0 \in A$, Taylor's theorem tells us (since $\nabla f(x_0) = 0$) that $f(x) = f(x_0) + h^2 g(x)$, where $h = |x - x_0|$ and $g$ is bounded on a small ball of radius $\epsilon$ about $x_0$; if $f$ is $C^2$, half the supremum of the operator norm of the Hessian over the ball will do. Taking $\lambda$ to be an upper bound for $|g|$, this gives $|f(x) - f(x_0)| \leq \lambda |x - x_0|^2$ for all $x$ in that ball.

Two points need a little care to finish. First, by compactness you can cover $A$ by finitely many balls $B(a_i, \epsilon_i / 2)$ of *half* the radius, while keeping each bound $\lambda_i$ valid on the full ball $B(a_i, \epsilon_i)$; let $\epsilon$ be the smallest of the $\epsilon_i / 2$ and $\lambda$ the largest of the $\lambda_i$. Second, when $x, y \in A$ with $|x - y| < \epsilon$, center the Taylor expansion at $x$ itself rather than at the ball's center: since $x \in A$ we have $\nabla f(x) = 0$, and if $x \in B(a_i, \epsilon_i / 2)$ then the segment from $x$ to $y$ stays inside $B(a_i, \epsilon_i)$, so the Hessian bound there yields $|f(x) - f(y)| \leq \lambda |x - y|^2$. (Centering at $a_i$ would only bound $|f(x) - f(y)|$ by $\lambda(|x - a_i|^2 + |y - a_i|^2)$, which is not controlled by $|x - y|^2$.)

Finally, if you happen to pick points $x$ and $y$ with $|x - y| \geq \epsilon$, then the max-and-min argument I mentioned at the top of the page takes over.
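Putting the two cases together into one constant (still assuming the target inequality is $|f(x) - f(y)| \leq \lambda' \, |x - y|^2$ for $x, y \in A$, and writing $M$, $m$ for the max and min of $f$ on $A$): the choice

$$\lambda' = \max\!\left( \lambda, \; \frac{M - m}{\epsilon^2} \right)$$

works for every pair $x, y \in A$, since the first term covers pairs with $|x - y| < \epsilon$ and the second covers pairs with $|x - y| \geq \epsilon$ by the max/min argument.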