
I have a simple (few variables), continuous, twice-differentiable convex function that I wish to minimize over the unit simplex. In other words, $\min f(\mathbf{x})$, $\text{s.t. } \mathbf{0} \preceq \mathbf{x}$ and $\mathbf{1}^\top \mathbf{x} = 1$. This will have to be performed multiple times.

What is a good optimization method to use? Preferably fairly simple to describe and implement?

There are some fairly sophisticated methods (such as “Mirror descent and nonlinear projected subgradient methods for convex optimization”, Beck and Teboulle) that specifically address minimization over the unit simplex, but they use only the gradient and not the Hessian.
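
For concreteness, here is a minimal sketch of the projection step such gradient methods rely on, paired with plain projected gradient descent. This is only an illustration, not part of the question: the sort-based projection follows the standard simplex-projection recipe, and the quadratic objective, step size, and iteration count are placeholders.

```python
import numpy as np

def project_simplex(y):
    """Euclidean projection of y onto {x : x >= 0, 1'x = 1} (sort-based)."""
    u = np.sort(y)[::-1]                      # entries in decreasing order
    css = np.cumsum(u)
    ks = np.arange(1, y.size + 1)
    rho = ks[u + (1.0 - css) / ks > 0][-1]    # largest k with u_k + (1 - sum_{i<=k} u_i)/k > 0
    tau = (css[rho - 1] - 1.0) / rho
    return np.maximum(y - tau, 0.0)

def projected_gradient(grad_f, x0, step=0.1, iters=500):
    """Plain projected gradient descent with a fixed step size (placeholder values)."""
    x = project_simplex(np.asarray(x0, dtype=float))
    for _ in range(iters):
        x = project_simplex(x - step * grad_f(x))
    return x

# Placeholder example: minimize ||x - c||^2 over the simplex.
c = np.array([0.2, 1.5, -0.3])
x_star = projected_gradient(lambda x: 2.0 * (x - c), np.ones(3) / 3)
```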

  • If the function is convex, why wouldn't a simple line search method work? Since the objective function is twice-differentiable, I'd think Newton's method would work fine; if it were only once-differentiable, gradient descent should work. Both are extremely well documented online, with pseudocode and application-specific code in abundance. (2011-12-06)
  • How about the Karush–Kuhn–Tucker conditions? http://en.wikipedia.org/wiki/Karush%E2%80%93Kuhn%E2%80%93Tucker_conditions (2011-12-06)
  • Thanks for the kind reply. I was wondering if there is a specific algorithm for minimization over the unit simplex that uses the Hessian (since it is easily calculated and the problem size is small). (I am still new to convex optimization.) In the end I implemented the interior point method and it works well. (2011-12-12)
  • This answer probably comes too late, but have you already looked into the paper "Projected Newton methods for optimization problems with simple constraints" by Bertsekas (SIAM J. Control and Optimization, Vol. 20, No. 2, March 1982)? As the title says, a projected Newton algorithm is introduced together with an Armijo-like line search, and superlinear convergence is proved for simple constraints. The above problem is listed as an example. (2013-10-26)
  • See the paper "Efficient Projections onto the l1-Ball for Learning in High Dimensions". This paper can solve exactly your problem. (2013-11-22)
  • No author? No source? This is hardly a helpful answer! (2013-11-22)
  • If this is all you require and the problem size is small, I would highly recommend an interior point method! E.g. you can turn the inequality constraint into a barrier term of the form $-\alpha\log(x)$ and send $\alpha \to 0$ so that it becomes a hard wall. See Boyd's Convex Optimization, ch. 11 (I believe, haven't taught the class in a while) on interior point methods for a simple implementation (should be no more than 30-50 lines of Python); a rough sketch of this barrier approach follows after these comments. (2017-08-22)
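
Since the question notes that the Hessian is cheap to compute, here is a rough sketch of the log-barrier idea from the last comment, under stated assumptions rather than a definitive implementation: Newton steps on $f(\mathbf{x}) - \alpha \sum_i \log x_i$ restricted to the hyperplane $\mathbf{1}^\top \mathbf{x} = 1$ (via the usual equality-constrained KKT linear system), with $\alpha$ shrunk geometrically. The functions `f`, `grad_f`, `hess_f` and all numerical parameters below are placeholders; see Boyd & Vandenberghe, ch. 10-11, for the full treatment.

```python
import numpy as np

def barrier_newton(f, grad_f, hess_f, x0, alpha=1.0, shrink=0.2,
                   outer=15, inner=50, tol=1e-9):
    """Equality-constrained Newton on a sequence of log-barrier subproblems."""
    x = np.asarray(x0, dtype=float)           # must start strictly feasible
    n = x.size
    one = np.ones(n)
    for _ in range(outer):
        for _ in range(inner):
            g = grad_f(x) - alpha / x                # gradient of f(x) - alpha*sum(log x)
            H = hess_f(x) + np.diag(alpha / x**2)    # Hessian of the barrier objective
            # Newton step restricted to 1'dx = 0: solve [H 1; 1' 0][dx; nu] = [-g; 0].
            K = np.block([[H, one[:, None]], [one[None, :], np.zeros((1, 1))]])
            dx = np.linalg.solve(K, np.concatenate([-g, [0.0]]))[:n]
            if np.linalg.norm(dx) < tol:
                break
            # Backtracking: stay strictly positive, then enforce an Armijo decrease.
            phi = lambda z: f(z) - alpha * np.sum(np.log(z))
            t = 1.0
            while np.any(x + t * dx <= 0):
                t *= 0.5
            while phi(x + t * dx) > phi(x) + 1e-4 * t * (g @ dx):
                t *= 0.5
            x = x + t * dx
        alpha *= shrink
    return x

# Placeholder example: the same quadratic objective as in the earlier sketch.
c = np.array([0.2, 1.5, -0.3])
x_star = barrier_newton(lambda x: np.sum((x - c) ** 2),
                        lambda x: 2.0 * (x - c),
                        lambda x: 2.0 * np.eye(x.size),
                        np.ones(3) / 3)
```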

2 Answers