$\epsilon$-normals to convex sets

I am reading the book by B. Mordukhovich, Variational Analysis and Generalized Differentiation I. On page 6 the following inclusion is stated: $ \hat{N}_{\varepsilon }\left( \bar{x};\Omega \right) \supset \hat{N}\left( \bar{x};\Omega \right) +\varepsilon \mathbb{B}^{\ast }, $ where $\mathbb{B}^{\ast }$ denotes the closed unit ball in the dual space $X^{\ast }$. If $\Omega $ is convex, then for any $\varepsilon \geq 0$ we have $ \hat{N}_{\varepsilon }\left( \bar{x};\Omega \right) =\{x^{\ast }\in X^{\ast }\mid \langle x^{\ast },x-\bar{x}\rangle \leq \varepsilon \Vert x-\bar{x}\Vert \text{ whenever }x\in \Omega \}, $ and $\hat{N}\left( \bar{x};\Omega \right) :=\hat{N}_{0}\left( \bar{x};\Omega \right) $. Mordukhovich says that for a convex set $\Omega $ the above inclusion holds as an equality. Unfortunately, I can't see why the reverse inclusion holds. I would be very grateful for any advice.

1 Answer
Proposition 1.
(Boris S. Mordukhovich, Variational Analysis and Generalized Differentiation I, page 5)
Let $\Omega$ be a nonempty convex set in a real Banach space $X$, and let $\bar{x}\in \Omega$ and $\varepsilon\geq 0$. Then $ \widehat{N}_\varepsilon(\bar{x}; \Omega):= \{x^*\in X^*\mid \limsup_{x\overset{\Omega}{\rightarrow}\bar{x}}\frac{\langle x^*, x-\bar{x}\rangle}{\|x-\bar{x}\|}\leq \varepsilon\} $ is convex and closed in the norm topology of $X^*$. Moreover, if $X$ is reflexive, then it is weak$^*$ closed in $X^*$.
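To see what this definition produces in a concrete case, here is a minimal one-dimensional illustration (my own example, not from the book). Take $X=\mathbb{R}$, $\Omega=(-\infty, 0]$ and $\bar{x}=0$, and identify $X^*$ with $\mathbb{R}$. For $x^*=a$ and $x<0$ we have $ \frac{\langle a, x-0\rangle}{\|x-0\|}=\frac{ax}{-x}=-a, $ so $ \limsup_{x\overset{\Omega}{\rightarrow}0}\frac{\langle a, x\rangle}{\|x\|}=-a, $ and therefore $a\in\widehat{N}_\varepsilon(0;\Omega)$ if and only if $-a\leq\varepsilon$. Thus $\widehat{N}_\varepsilon(0;\Omega)=[-\varepsilon, \infty)$, which is indeed convex and closed.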
Proposition 2.
(Boris S. Mordukhovich, Variational Analysis and Generalized Differentiation I, Proposition 1.3)
Let $\Omega$ be a nonempty convex set in a real Banach space $X$. Then $ \widehat{N}_\varepsilon(\bar{x}; \Omega)=\{x^*\in X^*| \langle x^*, x-\bar{x}\rangle\leq \varepsilon\|x-\bar{x}\| \; \text{whenever}\; x\in\Omega\} $ for any $\varepsilon\geq 0$ and $\bar{x}\in \Omega$. In particular, $\widehat{N}(\bar{x}; \Omega)$ agrees with the normal cone of convex analysis, i.e. $ \widehat{N}(\bar{x}; \Omega)=\{x^*\in X^*| \langle x^*, x-\bar{x}\rangle\leq 0 \; \text{whenever}\; x\in\Omega\}. $
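Continuing the illustration above (again my own example, not from the book): for $\Omega=(-\infty,0]$ and $\bar{x}=0$, the representation of Proposition 2 requires $ax\leq\varepsilon|x|$ for all $x\leq 0$; dividing by $x<0$ flips the inequality and gives $a\geq-\varepsilon$, so once more $\widehat{N}_\varepsilon(0;\Omega)=[-\varepsilon,\infty)$, in agreement with the limsup computation. In particular $\widehat{N}(0;\Omega)=[0,\infty)$, which is exactly the normal cone of convex analysis to the half-line at its endpoint.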
Combining Proposition 1 and Proposition 2, we obtain the following result.
Proposition 3.
If $\Omega$ is a nonempty convex subset of a real reflexive Banach space $X$, then $ \widehat{N}_\varepsilon(\bar{x}; \Omega)= \widehat{N}(\bar{x}; \Omega)+\varepsilon \mathbb{B}^*. $ Proof. $(\supset)$ Suppose that $x_0^*\in \widehat{N}(\bar{x}; \Omega)$ and $u^*\in \varepsilon \mathbb{B}^*$. It follows from Proposition 2 that $ \langle x_0^*, x-\bar{x}\rangle\leq 0 \quad \forall x\in \Omega. $ Hence \begin{equation*} \begin{array}{lll} \langle x_0^*+u^*, x-\bar{x}\rangle&=&\langle x_0^*, x-\bar{x}\rangle+\langle u^*, x-\bar{x}\rangle\\ &\leq&0+\|u^*\|\|x-\bar{x}\|\\ &\leq& \varepsilon\|x-\bar{x}\| \end{array} \end{equation*} for all $x\in \Omega$. This implies that $x_0^*+u^*\in \widehat{N}_\varepsilon(\bar{x}; \Omega)$. Therefore $\widehat{N}_\varepsilon(\bar{x}; \Omega)\supset\widehat{N}(\bar{x}; \Omega)+\varepsilon \mathbb{B}^*$.
$(\subset)$ Let $\widehat{N}^*:=\widehat{N}(\bar{x}; \Omega)+\varepsilon \mathbb{B}^*$. Since $X$ is reflexive, it follows from Proposition 1 (with $\varepsilon=0$) that $\widehat{N}(\bar{x}; \Omega)$ is convex and weak$^*$ closed in $X^*$. Moreover, $\varepsilon \mathbb{B}^*$ is convex and weak$^*$ compact in $X^*$ by the Banach–Alaoglu theorem. Since the sum of a weak$^*$ closed set and a weak$^*$ compact set is weak$^*$ closed, $\widehat{N}^*$ is nonempty ($0\in \widehat{N}^*$), weak$^*$ closed and convex in $X^*$.
Suppose, toward a contradiction, that there exists $x^*\in X^*$ such that $ x^*\in \widehat{N}_\varepsilon(\bar{x}; \Omega) \; \text{and} \; x^*\notin \widehat{N}^*. $ By the separation theorem (see W. Rudin, Functional Analysis, Theorem 3.4(b)), applied to $X^*$ with its weak$^*$ topology, whose dual is $X$, there exists $x\in X$ such that $ \langle x^*, x\rangle>\sup_{f^*\in \widehat{N}^*}\langle f^*, x\rangle. $ Since $f_0^*+f_1^*\in\widehat{N}^*$ whenever $f_0^*\in \widehat{N}(\bar{x}; \Omega)$ and $f_1^*\in \varepsilon\mathbb{B}^*$, and $0$ lies in both sets, it follows that $ \begin{cases} \langle x^*, x\rangle>\langle f_0^*, x\rangle \quad \forall f_0^*\in \widehat{N}(\bar{x}; \Omega),&\\ \langle x^*, x\rangle>\langle f_1^*, x\rangle \quad \forall f_1^*\in \varepsilon\mathbb{B}^*.& \end{cases} $ Since $\widehat{N}(\bar{x}; \Omega)$ is a cone, the first inequality forces $\langle f_0^*, x\rangle\leq 0$ for all $f_0^*\in \widehat{N}(\bar{x}; \Omega)$ (otherwise $\langle tf_0^*, x\rangle\to\infty$ as $t\to\infty$), and since the supremum $\sup_{f_1^*\in\varepsilon\mathbb{B}^*}\langle f_1^*, x\rangle=\varepsilon\|x\|$ is attained (Hahn–Banach), the second inequality gives $\langle x^*, x\rangle>\varepsilon\|x\|$. Hence $ \begin{cases} 0\geq\langle f_0^*, x\rangle \quad \forall f_0^*\in \widehat{N}(\bar{x}; \Omega),&\\ \langle x^*, x\rangle>\varepsilon\|x\|.& \end{cases} $ By Proposition 2, $\widehat{N}(\bar{x}; \Omega)$ agrees with the normal cone of convex analysis, and so $ \begin{cases} x\in (\widehat{N}(\bar{x}; \Omega))^*=T(\bar{x}; \Omega)=\overline{\text{cone}(\Omega-\bar{x})},\\ \langle x^*, x\rangle>\varepsilon\|x\|,& \end{cases} $ where $(\widehat{N}(\bar{x}; \Omega))^*$ denotes the (negative) polar cone and $T(\bar{x};\Omega)$ is the tangent cone of $\Omega$ at $\bar{x}$. Then there exist $\{t_k\}\subset\mathbb{R}^+$ and $\{x_k\}\subset\Omega$ such that $t_k(x_k-\bar{x})\rightarrow x$. Hence, for sufficiently large $k$ we have $ \langle x^*,t_k(x_k-\bar{x}) \rangle>\varepsilon\|t_k(x_k-\bar{x})\| $ or equivalently, dividing by $t_k>0$, $ \langle x^*,x_k-\bar{x} \rangle>\varepsilon\|x_k-\bar{x}\|. $ By the representation of Proposition 2, this implies that $x^*\notin \widehat{N}_\varepsilon(\bar{x}; \Omega)$, which is a contradiction. $\square$
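To make the proof tangible, one can trace it in the one-dimensional illustration used earlier (my own example, not from the book). There $\widehat{N}(0;\Omega)+\varepsilon\mathbb{B}^*=[0,\infty)+[-\varepsilon,\varepsilon]=[-\varepsilon,\infty)=\widehat{N}_\varepsilon(0;\Omega)$, so the two sets coincide, as Proposition 3 asserts. And if one pretends that some $x^*=-2\varepsilon$ (with $\varepsilon>0$) were an $\varepsilon$-normal, the separation step produces $x=-1$: indeed $ \langle x^*, x\rangle=2\varepsilon>\varepsilon=\sup_{f^*\geq-\varepsilon}\langle f^*, -1\rangle, $ and with $x_k=-1/k\in\Omega$, $t_k=k$, the final step yields $ \langle x^*, x_k-\bar{x}\rangle=\frac{2\varepsilon}{k}>\frac{\varepsilon}{k}=\varepsilon\|x_k-\bar{x}\|, $ exhibiting directly that $x^*\notin\widehat{N}_\varepsilon(0;\Omega)=[-\varepsilon,\infty)$.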