
Suppose I want to solve the constrained optimization problem $$ \inf_{\{x \in \mathbb{R}^d: g(x)=0\}} f(x), $$ where $f$ is convex and $g$ is lsc.
I know I can rewrite it as a minimax problem $$ \inf_{\{x \in \mathbb{R}^d: g(x)=0\}} \sup_{\lambda \in [0,\infty)}f(x) + \lambda g(x). $$

Can I rewrite this as $$ \inf_{\{x \in \mathbb{R}^d: g(x)=0\}} \sup_{\lambda \in (0,1]} (1-\lambda) f(x) + \lambda g(x)? $$ If not, under what conditions would this be possible?
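
A quick computation (my own, not from the thread) shows what the inner supremum does at a feasible point, i.e. when $g(x) = 0$:
$$ \sup_{\lambda \in (0,1]} \, (1-\lambda) f(x) + \lambda g(x) \;=\; \sup_{\lambda \in (0,1]} \, (1-\lambda) f(x) \;=\; \begin{cases} f(x), & f(x) \ge 0, \\ 0, & f(x) < 0, \end{cases} $$
so the proposed convex-combination form recovers the original objective on the feasible set only when $f \ge 0$ there, which matches the discussion in the comments below.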

  • 0
    Yes, since your $\inf$ still contains the constraint $g(x) = 0$. Is this intended? – 2017-01-26
  • 0
    You need $\lambda \in \mathbb{R}$ instead of nonnegative. And your last operation changes the objective value, whereas the first does not. – 2017-01-26
  • 0
    If $f$ and $g$ are non-negative then this should not be an issue, right? – 2017-01-26
  • 1
    Not an issue for $\lambda \geq 0$, because then $g(x)=0$ is equivalent to $g(x) \leq 0$. However, the other comments (including gerw's) are still valid. – 2017-01-27
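
The point that the convex-combination form changes the objective value can be checked numerically. A minimal sketch with a made-up example pair (my choice, not from the thread): $f(x) = x^2 - 10$ and $g(x) = x - 2$, so that the feasible point $x = 2$ has $f(2) = -6 < 0$.

```python
import numpy as np

# Hypothetical example (not from the post): f(x) = x**2 - 10, g(x) = x - 2.
# At the feasible point x = 2 we have g(2) = 0 and f(2) = -6 < 0.
f = lambda x: x**2 - 10
g = lambda x: x - 2

x = 2.0
lams = np.linspace(1e-6, 1.0, 10_001)  # grid over lambda in (0, 1]
vals = (1 - lams) * f(x) + lams * g(x)

# The inner sup of the convex-combination form evaluates to max(f(x), 0) = 0
# here, not f(x) = -6, so the reformulation changes the objective when f < 0.
print(f(x), vals.max())  # -6.0 0.0
```

With $\lambda \in [0,\infty)$ and $g(x) = 0$, by contrast, $f(x) + \lambda g(x) = f(x)$ for every $\lambda$, so the original Lagrangian form leaves the value untouched.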

0 Answers