12

Let $K$ be a nonempty compact convex subset of $\mathbb R^n$ and let $f$ be the function that maps $x \in \mathbb R^n$ to the unique closest point $y \in K$ with respect to the $\ell_2$ norm. I want to prove that $f$ is continuous, but I can't seem to figure out how.

My thoughts: Suppose $x_n \to x$ in $\mathbb R^n$. Let $y_n = f(x_n)$ and let $y = f(x)$. By the compactness of $K$, there is a subsequence $(y_{n_k})$ converging to some $y' \in K$. If $y \ne y'$, then $\|x-y\| < \|x-y'\|$. Furthermore, any point $z \ne y$ on the line segment joining $y$ and $y'$ also satisfies $\|x-y\|<\|x-z\|$. I don't know where to go from here. Any tips?

  • 2
    +1 for context and giving it some serious thought before posting. (2011-03-21)
  • 1
    $\ell_2$ usually denotes the space of all square-summable sequences; do you mean the usual Euclidean norm? (2011-03-21)
  • 0
    I don't think you need to use convexity again (you only need it to show that the point is unique). The $y_n$ seem to be going to both $y$ and $y'$; argue that they can't, using the triangle $x,y,y'$ (look at $\|x-x_n\|$ small with respect to $\|y-y'\|$). (2011-03-21)
  • 0
    @yoyo: I presume you meant "compactness"? It's true that you don't need compactness again and it's only there to show that the point is unique; you do need convexity again, though, since otherwise the nearest point could "jump". (2011-03-21)
  • 0
    @joriki: actually, you don't need compactness at all; requiring $K$ to be closed and convex suffices. You can show that any sequence approximating the distance of $x$ to $K$ is Cauchy by considering the midpoints of that sequence and using the parallelogram law. (2011-03-21)

2 Answers

7

Consider two points $a$ and $b$, and for simplicity and without loss of generality, assume $f(a)=0$. Then the line through $f(a)$ and $f(b)$ can be parametrized as $\lambda f(b)$, and by convexity all points with $0\le\lambda\le1$ are in $K$. The point on this line nearest to a given point $c$ is determined by

$$(\lambda_c f(b) - c)\cdot f(b)=0$$

(the vector from the nearest point to $c$ is perpendicular to the line), which gives

$$\lambda_c =\frac{c\cdot f(b)}{f(b)\cdot f(b)}\;,$$

where we can divide by $f(b)\cdot f(b)$ since the case $f(b)=0=f(a)$ is trivial. We know that $f(a)$ is the closest point to $a$ in $K$, and $f(b)$ is the closest point to $b$ in $K$. It follows that $\lambda_a \le 0$ and $\lambda_b \ge 1$: otherwise the point $\min(\lambda_a,1)\,f(b)\in K$ would be closer to $a$ than $f(a)$ is, and similarly for $b$. Putting everything together, we have

$$\|f(b)-f(a)\|=\|f(b)\|=1\cdot\|f(b)\|\le(\lambda_b-\lambda_a)\|f(b)\|=\frac{(b-a)\cdot f(b)}{f(b)\cdot f(b)} \|f(b)\| \le \|b-a\|\;,$$

where the last inequality is Cauchy–Schwarz.

Thus, for given $\epsilon$, you can take $\delta=\epsilon$ to get $\|b-a\|<\delta\Rightarrow \|f(b)-f(a)\|<\epsilon$.
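As a sanity check (not part of the proof), the $1$-Lipschitz bound $\|f(b)-f(a)\|\le\|b-a\|$ can be tested numerically for a concrete compact convex set. Here $K$ is the closed unit ball, for which the projection has the closed form $x/\max(1,\|x\|)$; the test points and tolerances are my own choices for illustration.

```python
import numpy as np

def proj_unit_ball(x):
    """Project x onto the closed unit ball in R^n: x / max(1, ||x||)."""
    return x / max(1.0, np.linalg.norm(x))

rng = np.random.default_rng(0)
for _ in range(10_000):
    a, b = rng.normal(size=(2, 3)) * 5  # random points in R^3
    lhs = np.linalg.norm(proj_unit_ball(b) - proj_unit_ball(a))
    rhs = np.linalg.norm(b - a)
    # ||f(b) - f(a)|| <= ||b - a||, up to floating-point tolerance
    assert lhs <= rhs + 1e-12
print("1-Lipschitz check passed")
```

Of course no finite sample proves the inequality; the point is only that the closed-form projection behaves as the argument above predicts.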

  • 0
    I understand the proof and the idea behind it. Very nice. Thanks. (2011-03-23)
  • 0
    @echoone: You're welcome. (I'm glad you understand the idea, too :-) (2011-03-23)
1

Another way, for the record:

The projection onto a convex set satisfies the variational inequality $(z - f(a))\cdot(a - f(a)) \le 0$ for all $z \in K$ (otherwise moving from $f(a)$ toward $z$ inside $K$ would decrease the distance to $a$). Since $f(a)$ is optimal, taking $z = f(b)$ gives $$(f(b) - f(a))\cdot (a - f(a)) \le 0.$$ Similarly, since $f(b)$ is optimal, taking $z = f(a)$ gives $$(f(b)-f(a))\cdot(f(b) - b)\le 0.$$

When we sum these two inequalities and rearrange, we get $$\begin{align}\|f(a)-f(b)\|^2&\le (f(b)-f(a))\cdot(b-a) \\ & \le \|a-b\|\|f(a)-f(b)\|,\end{align}$$ with the second inequality by Cauchy–Schwarz. Dividing through by $\|f(a)-f(b)\|$ (if $f(a)=f(b)$ there is nothing to prove), we get $\|f(a)-f(b)\|\le\|a-b\|$, so $f$ is $1$-Lipschitz and in particular continuous.
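The two variational inequalities above can also be checked numerically for a concrete example. Here $K=[0,1]^n$, where the projection is coordinatewise clipping; the set, sample sizes, and tolerances are my own illustrative choices, not part of the answer's argument.

```python
import numpy as np

def proj_box(x):
    """Project x onto the box K = [0,1]^n (coordinatewise clipping)."""
    return np.clip(x, 0.0, 1.0)

rng = np.random.default_rng(1)
for _ in range(10_000):
    a, b = rng.normal(size=(2, 4)) * 3  # random points in R^4
    fa, fb = proj_box(a), proj_box(b)
    # Variational inequalities from the optimality of f(a) and f(b):
    assert (fb - fa) @ (a - fa) <= 1e-12
    assert (fb - fa) @ (fb - b) <= 1e-12
    # Their sum rearranges to ||f(a)-f(b)||^2 <= (f(b)-f(a))·(b-a):
    assert (fa - fb) @ (fa - fb) <= (fb - fa) @ (b - a) + 1e-12
print("variational inequality checks passed")
```

Clipping satisfies the inequalities coordinate by coordinate, which makes the box a convenient test case even though the proof itself is dimension-free.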