
Is there a notion of "best" convex relaxation of a particular function in a normed vector space? For example, the $\ell^0$ pseudo-norm can be relaxed into the $\ell^1$ norm, and that allows us to solve sparse recovery problems. Is there a general recipe to construct such relaxations given an arbitrary function?

Specifically, I have a function of the form $f(x) = \|x\|_2\|x\|_1$, which is nonconvex. How would I go about relaxing it? Are there any resources or links I should look into?
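(A quick numerical sanity check, since one can reasonably ask whether this product is convex at all — in low dimensions it may well be. The points below, in $\mathbb R^4$, are my own hand-picked choice; they exhibit a failure of midpoint convexity:)

```python
import numpy as np

def f(x):
    # f(x) = ||x||_2 * ||x||_1, the function in question
    return np.linalg.norm(x, 2) * np.linalg.norm(x, 1)

# A convex f must satisfy midpoint convexity:
#   f((x + y)/2) <= (f(x) + f(y))/2   for all x, y.
x = np.array([1.005, 0.005, 0.005, 0.005])  # endpoints straddling
y = np.array([0.995, 0.015, 0.015, 0.015])  # the midpoint (1, .01, .01, .01)
gap = f((x + y) / 2) - (f(x) + f(y)) / 2
print(gap)  # ~9.1e-06 > 0: the midpoint lies above the chord
```

So at least in $\mathbb R^4$ the product is genuinely nonconvex, although the violation along this segment is tiny.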

  • Actually, are you sure $\|x\|_2\|x\|_1$ is nonconvex? (2015-07-31)

1 Answer


A function is convex if and only if its epigraph $\big\{(\mathbf x,z):z\ge f(\mathbf x)\big\}$ is a convex set. So if your function is nonconvex, take the function corresponding to the convex hull of its epigraph. Other equivalent ways of describing this:

  • Let $\hat f$ be the largest convex function dominated by $f$. $\hat f(\mathbf x) = \max\big\{g(\mathbf x):\text{$g$ is convex and $g(\mathbf y)\le f(\mathbf y)$ for all $\mathbf y$}\big\}.$
  • Let $\hat f$ be the maximum of all affine functions dominated by $f$. $\hat f(\mathbf x) = \max\big\{\mathbf a\cdot\mathbf x + b:\text{$\mathbf a\cdot\mathbf y+b\le f(\mathbf y)$ for all $\mathbf y$}\big\}.$
  • Let $\hat f(\mathbf x)$ be the smallest $\sum\alpha_i f(\mathbf x_i)$ over all possible convex combinations $\sum\alpha_i\mathbf x_i=\mathbf x$. $\hat f(\mathbf x) = \textstyle\min\big\{\sum\alpha_i f(\mathbf x_i):\alpha_i\ge0,\sum\alpha_i=1,\sum\alpha_i\mathbf x_i = \mathbf x\big\}.$
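To make the epigraph description concrete, here is a small numerical sketch of my own (Python with NumPy, not from any reference): for a sampled one-dimensional function, the convex envelope is the lower convex hull of the points $(x_i, f(x_i))$, computed below with Andrew's monotone chain and piecewise-linear interpolation between hull vertices.

```python
import numpy as np

def convex_envelope_1d(xs, fs):
    """Lower convex hull of the sampled points (xs[i], fs[i]) --
    a discrete version of 'the convex hull of the epigraph'.
    xs must be sorted in increasing order."""
    hull = []  # indices of points on the lower hull (monotone chain)
    for i in range(len(xs)):
        # pop the last hull point while it lies on or above the chord
        # from hull[-2] to the new point i (non-left turn)
        while len(hull) >= 2:
            a, b = hull[-2], hull[-1]
            cross = (xs[b] - xs[a]) * (fs[i] - fs[a]) \
                  - (fs[b] - fs[a]) * (xs[i] - xs[a])
            if cross <= 0:
                hull.pop()
            else:
                break
        hull.append(i)
    # interpolate linearly between hull vertices to get the envelope
    return np.interp(xs, xs[hull], fs[hull])

xs = np.linspace(-2.0, 2.0, 401)
fs = (xs**2 - 1.0)**2            # nonconvex double well, minima at x = -1, 1
env = convex_envelope_1d(xs, fs)
print(env[200])  # envelope at x = 0: ~0, whereas f(0) = 1
```

The envelope agrees with $f$ on the convex pieces and replaces the hump between the two minima by the flat chord at height $0$, exactly as the convex-combination characterization predicts.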

(I'm being sloppy and assuming that the $\max$/$\min$ are attained and we don't need $\sup$/$\inf$ instead; I might be wrong.)

I haven't given any thought to your specific case of $f(\mathbf x)=\lVert\mathbf x\rVert_2\lVert\mathbf x\rVert_1$ yet. If I think of something, I'll edit it into this answer. Surely the symmetry of the function should help a lot here.

  • But the problem is not to minimize $f$. The problem is to find a convex function that best approximates $f$. (2013-01-04)