
Consider a convex optimization problem.

$\min_{u\in\Re^k} f(u)$

s.t. $g_i(u)\leq0,\ i=1,\ldots,m$

Let $F(x)=F(u,\lambda)=\left(f'(u)+\sum_{i=1}^m\lambda_i g_i'(u),\,-g_1(u),\ldots,-g_m(u)\right):\Re^n\rightarrow\Re^n$, where $n=k+m$, and let

$G=\{x=(u,\lambda)\in\Re^n:\lambda\geq0\}$.

Assume $f$ and the $g_i$ are all convex and, for simplicity, continuously differentiable ($C^1$).

Is $F(x)$ a monotone operator on $G$, i.e., is $(F(x_1)-F(x_2))^T(x_1-x_2)\geq0$ always satisfied for $x_1,x_2\in G$?

Thanks a lot.

1 Answer

First of all, notice that $F$ is a sum of operators: $(f', 0)$ and $m$ operators of the form $(\lambda g', -g)$. I claim that each of them is monotone; since a sum of monotone operators is monotone, this answers the question. For the first one this is clear: it is essentially the gradient of the convex function $f$. To deal with the rest, recall the inequality $g(v)\ge g(u)+g'(u)(v-u)$, which expresses the convexity of $g$. It follows that in the expression $(\lambda g'(u)- \mu g'(v))(u-v) + (-g(u)+g(v))(\lambda -\mu)$ the coefficients of $\lambda$ and of $\mu$ are both nonnegative: the coefficient of $\lambda$ is $g'(u)(u-v)+g(v)-g(u)\ge 0$, and the coefficient of $\mu$ is $g(u)-g(v)-g'(v)(u-v)\ge 0$. Since $\lambda,\mu\ge 0$ on $G$, the whole expression is nonnegative. $\Box$
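The argument above can be sanity-checked numerically. Below is a minimal sketch on a hypothetical one-dimensional instance of my own choosing (not from the post): $f(u)=u^2$ with a single constraint $g(u)=u^2-1$, so $k=m=1$ and $n=2$. It samples random pairs of points in $G=\{(u,\lambda):\lambda\ge0\}$ and checks the monotonicity inequality.

```python
import numpy as np

# Hypothetical instance (illustration only): f(u) = u^2, g(u) = u^2 - 1.
f_grad = lambda u: 2.0 * u
g = lambda u: u**2 - 1.0
g_grad = lambda u: 2.0 * u

def F(x):
    """F(u, lam) = (f'(u) + lam * g'(u), -g(u))."""
    u, lam = x
    return np.array([f_grad(u) + lam * g_grad(u), -g(u)])

rng = np.random.default_rng(0)
for _ in range(10_000):
    # Sample two points in G = { (u, lam) : lam >= 0 }.
    x1 = np.array([rng.normal(), rng.uniform(0.0, 5.0)])
    x2 = np.array([rng.normal(), rng.uniform(0.0, 5.0)])
    # Monotonicity: (F(x1) - F(x2))^T (x1 - x2) >= 0 (small tolerance
    # for floating-point round-off).
    assert (F(x1) - F(x2)) @ (x1 - x2) >= -1e-9

print("monotonicity held on all sampled pairs")
```

Note that the inequality is only checked for $\lambda\ge0$: outside $G$ (negative multipliers) it can fail, which is why the question restricts to $G$ in the first place.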

  • Thanks a lot. It seems I took one unnecessary step which hid the correct answer. (2012-08-12)