
I recently encountered an optimization problem and am looking for technical papers on it.

The problem is given below (for $ f \left( \cdot \right) $ convex and $ r \left( \cdot \right) $ concave):

$\min_x \ f(x) + \lambda \, r(x)$
$\text{s.t.} \ x \geq 0, \ \|x\|_1 = 1$.

where $x$ is an $n$-dimensional vector. Here $x \geq 0$ means each component of $x$ is nonnegative. The role of the regularization term is to push some entries to zero while increasing the others; as $\lambda$ increases, $x$ eventually becomes all zeros except for a single entry equal to 1. This paper solved the above optimization problem without the regularization term.

Could anyone point me to any algorithm for solving the above problem?

2 Answers


Both constraints in your problem are linear. As Dirk has noted, the objective is a difference of two convex functions. You may want to go through this paper for a relevant algorithm:

A. L. Yuille and A. Rangarajan, "The concave-convex procedure", Neural Computation, vol. 15, no. 4, pp. 915-936, 2003.

The concave-convex procedure is a special case of the broad class of Majorization-Minimization (MM) algorithms. The following tutorial is a good read:

D. R. Hunter and K. Lange, "A tutorial on MM algorithms", The American Statistician, vol. 58, no. 1, pp. 30-37, 2004.

Let me summarize the solution to your problem. Since $r(x)$ is concave, its first-order Taylor expansion at any point $x_t$ is a global upper bound (the supporting-hyperplane property of concave functions). That is:

$r(x) \leq r(x_t) + \nabla r(x_t)^T (x - x_t) \quad \forall x$
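As a sanity check, this tangent bound can be verified numerically for a simple one-dimensional concave function such as $\sqrt{x}$ (chosen here purely for illustration, not taken from the question):

```python
import numpy as np

def r(x):                    # example concave function on x > 0
    return np.sqrt(x)

def r_prime(x):              # its derivative
    return 0.5 / np.sqrt(x)

x_t = 0.5                    # linearization point
xs = np.linspace(0.01, 2.0, 200)
tangent = r(x_t) + (xs - x_t) * r_prime(x_t)

# the tangent line majorizes r everywhere on the grid
assert np.all(r(xs) <= tangent + 1e-12)
```

The same inequality holds with the gradient in place of the scalar derivative for concave functions of a vector argument.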

This enables us to write an upper-bound on your objective as well:

$f(x) + \lambda r(x) \leq f(x) + \lambda \left[ r(x_t) + \nabla r(x_t)^T (x - x_t) \right] \quad \forall x$

The great thing about this upper bound is that it is convex in $x$, because $f(x)$ is convex and the other term is affine (hence both concave and convex). You can thus replace your objective with this convex surrogate and solve the problem iteratively, re-linearizing $r$ at the new iterate each time. As a simple exercise, you can show that the original objective function does not increase over successive iterations of this procedure.
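To make this concrete, here is a minimal sketch of the resulting iterative scheme, assuming for illustration $f(x) = \tfrac{1}{2}\|Ax - b\|_2^2$ and the concave penalty $r(x) = \sum_i \sqrt{x_i}$ (neither is specified in the question). Each subproblem minimizes the convex surrogate over the simplex, here via SciPy's SLSQP solver:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((8, n))
b = rng.standard_normal(8)
lam = 0.1
eps = 1e-8                     # keeps sqrt differentiable near 0

def f(x):                      # convex part (illustrative choice)
    return 0.5 * np.sum((A @ x - b) ** 2)

def r(x):                      # concave part on x >= 0 (illustrative choice)
    return np.sum(np.sqrt(x + eps))

def grad_r(x):
    return 0.5 / np.sqrt(x + eps)

cons = [{"type": "eq", "fun": lambda z: np.sum(z) - 1.0}]  # 1^T x = 1
bnds = [(0.0, None)] * n                                   # x >= 0

x = np.full(n, 1.0 / n)        # start at the simplex centre
obj0 = f(x) + lam * r(x)
for _ in range(30):
    g = grad_r(x)              # linearize r at the current iterate
    res = minimize(lambda z: f(z) + lam * g @ z, x,
                   bounds=bnds, constraints=cons, method="SLSQP")
    x = res.x

obj = f(x) + lam * r(x)        # should not exceed obj0
```

Constant terms of the linearization are dropped from the subproblem since they do not affect its minimizer. The monotone-descent property of MM guarantees `obj <= obj0` up to solver tolerance.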


I don't know what you mean by "solving the problem", but presumably you are asking for pointers to algorithms that could be helpful? In that case the answer depends on the properties of $f$ and $r$.

Note that the additional constraints $x \geq 0$ and $\|x\|_1 = 1$ can be rephrased as $x \geq 0$ and $\mathbf{1}^T x = 1$, and hence are linear; this narrows the possible solvers to those that can handle linear constraints.
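The rephrasing rests on the fact that for nonnegative $x$ the $\ell_1$ norm reduces to the linear functional $\mathbf{1}^T x$, which a quick numerical check confirms (random vector chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.random(4)            # x >= 0 componentwise
x /= x.sum()                 # normalize so the l1 norm is 1

ones = np.ones_like(x)
# for x >= 0, |x|_1 = sum_i x_i = 1^T x, a linear function of x
assert np.isclose(np.linalg.norm(x, 1), ones @ x)
assert np.isclose(ones @ x, 1.0)
```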

  • Thanks a lot for your valuable comment. I didn't think along those lines, i.e., that a concave function is a negated convex function. 2012-08-18