
I have a simple (few variables), continuous, twice-differentiable convex function that I wish to minimize over the unit simplex. In other words, $\min f(\mathbf{x})$, $\text{s.t. } \mathbf{0} \preceq \mathbf{x}$ and $\mathbf{1}^\top \mathbf{x} = 1$. This will have to be performed multiple times.

What is a good optimization method to use, preferably one that is fairly simple to describe and implement?

There are some fancier methods, such as mirror descent (“Mirror descent and nonlinear projected subgradient methods for convex optimization”, Beck and Teboulle), that are tailored specifically to minimization over the unit simplex. But these methods use only the gradient and not the Hessian.

  • If this is all you require and the problem size is small, I would highly recommend an interior-point method! E.g. you can fold the inequality constraint into the objective as a log-barrier term $-\alpha \sum_i \log(x_i)$ and send $\alpha \to 0$ so that it approaches a hard wall. See Boyd and Vandenberghe's Convex Optimization, ch. 11 (I believe, haven't taught the class in a while) on interior-point methods for a simple implementation (should be no more than 30-50 lines of Python); a sketch follows below. – 2017-08-22
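
For concreteness, here is a minimal log-barrier sketch of that suggestion, written in MATLAB to match the code in the answers below rather than Python. CalcFunGrad and CalcFunHess are hypothetical gradient/Hessian oracles, and vXInit is assumed strictly feasible ($\mathbf{x} \succ \mathbf{0}$, $\mathbf{1}^\top \mathbf{x} = 1$):

    n     = numel(vXInit);
    vX    = vXInit;   % Must satisfy vX > 0 and sum(vX) == 1
    alpha = 1;        % Barrier weight, driven towards 0
    while (alpha > 1e-9)
        for newtonIdx = 1:50 % Newton's method on f(x) - alpha * sum(log(x))
            vG = CalcFunGrad(vX) - alpha ./ vX;
            mH = CalcFunHess(vX) + alpha * diag(1 ./ (vX .^ 2));
            % Equality-constrained Newton step: the KKT system enforces
            % sum(vDx) == 0, so the iterates stay on the hyperplane
            vSol = [mH, ones(n, 1); ones(1, n), 0] \ [-vG; 0];
            vDx  = vSol(1:n);
            % Damped step: backtrack until strictly positive (a full line
            % search on the objective is omitted for brevity)
            t = 1;
            while any(vX + t * vDx <= 0)
                t = t / 2;
            end
            vX = vX + 0.9 * t * vDx;
            if (norm(vDx) < 1e-8)
                break;
            end
        end
        alpha = 0.1 * alpha; % Tighten the barrier
    end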

2 Answers


The simplest, yet pretty fast, method (in total running time, not per-iteration time) would be the accelerated projected gradient descent method.

All you need is to calculate the gradient of the function $ f \left( x \right) $ and project each iterate onto the unit simplex.

A MATLAB code would be:

    simplexRadius = 1;
    stopThr       = 1e-5;
    vX            = vXInit;
    vY            = vX; % Extrapolated point for the accelerated (Nesterov) step
    tK            = 1;  % Momentum parameter
    for ii = 1:numIterations
        vG     = CalcFunGrad(vY); % Gradient at the extrapolated point
        vXPrev = vX;
        % stepSize is assumed set beforehand, e.g. 1 / L for an L-smooth f
        vX     = ProjectSimplex(vY - stepSize * vG, simplexRadius);
        tKPrev = tK;
        tK     = (1 + sqrt(1 + 4 * tK ^ 2)) / 2;
        vY     = vX + ((tKPrev - 1) / tK) * (vX - vXPrev);
        if (norm(vX - vXPrev) <= stopThr)
            break;
        end
    end

You can take the simplex projection function from the link I pasted (or directly from my Ball Projection GitHub repository); a sketch is also given below.
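
For completeness, a minimal sketch of such a Euclidean projection onto the unit simplex (the standard sort-based method, e.g. Duchi et al. 2008; the name ProjectSimplex and its signature are just what the loop above assumes):

    function [ vX ] = ProjectSimplex( vY, simplexRadius )
        % Euclidean projection of the vector vY onto the simplex
        % { x : x >= 0, sum(x) == simplexRadius }, via sorting
        vSorted = sort(vY(:), 'descend');
        vCumSum = cumsum(vSorted) - simplexRadius;
        vIdx    = (1:numel(vY)).';
        rho     = find(vSorted - (vCumSum ./ vIdx) > 0, 1, 'last');
        theta   = vCumSum(rho) / rho; % Shift that makes the result sum to simplexRadius
        vX      = max(vY - theta, 0);
    end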


The exponentiated gradient descent algorithm may be another approach that is simple to implement.
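
A minimal sketch of the multiplicative update (CalcFunGrad, stepSize, numIterations, and vXInit are borrowed from the other answer's notation and are assumptions here):

    n  = numel(vXInit);
    vX = ones(n, 1) / n; % Start at the center of the simplex
    for ii = 1:numIterations
        vG = CalcFunGrad(vX);           % Gradient oracle
        vX = vX .* exp(-stepSize * vG); % Multiplicative (mirror descent) update
        vX = vX / sum(vX);              % Renormalize back onto the simplex
    end

No explicit projection is needed: the update keeps every coordinate strictly positive, and the normalization returns the iterate to the simplex.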

The lecture material from the University of Chicago gives a nice overview.