6

There are so many different optimization algorithms out there, and lots of research going on. However, I have difficulty finding good comparisons between them, and all articles / books / papers seem to evaluate them in different ways.

Isn't there some set of standard functions to run "standardized" benchmarks in order to test and compare all such algorithms? Is there some reference website showing how each algorithm performs?

(I'm especially interested in numerical derivative-free optimization techniques)

  • 0
    @Mike: concerning your arguments: (2) I meant convergence speed per function evaluation, plus stability, and therefore independent of the implementation. (1) On the contrary, I think many real-world problems require optimizing "some" arbitrary function; I doubt many practitioners can tweak an existing optimization process with problem-specific knowledge to make it converge faster or more stably. 2011-09-20

2 Answers

5

The CUTEr collection (https://magi-trac-svn.mathappl.polymtl.ca/Trac/cuter/wiki) bundles over 1000 optimization problems, including the Hock and Schittkowski collection, the Maros and Meszaros collection, and more. Problems are modeled in the (somewhat dated) SIF modeling language. The main reference is http://dl.acm.org/citation.cfm?id=962439

EDIT: CUTEst, the updated version of CUTEr, is available from https://ccpforge.cse.rl.ac.uk/gf/project/cutest/wiki. The main reference is https://doi.org/10.1007/s10589-014-9687-3.

The collection is also available in the AMPL modeling language (along with lots of other problems): http://www.orfe.princeton.edu/~rvdb/ampl/nlmodels (see also https://github.com/mpf/Optimization-Test-Problems).

The COPS collection has discretized control problems: http://www.mcs.anl.gov/~more/cops/

On the same page, you will read about performance profiles, which are a standard tool for comparing several algorithms on a given problem collection.
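For illustration, the standard (Dolan and Moré) performance profile is easy to compute yourself. Here is a minimal sketch in Python with NumPy; the cost matrix below is made-up data, not results from any real solver:

```python
import numpy as np

def performance_profile(costs, taus):
    """Compute Dolan-More performance profiles.

    costs: (n_problems, n_solvers) array of a cost measure, e.g.
           function evaluations; np.inf marks a failed run.
    taus:  iterable of performance-ratio thresholds (tau >= 1).
    Returns an (n_taus, n_solvers) array rho, where rho[i, s] is the
    fraction of problems that solver s solves within a factor taus[i]
    of the best solver on each problem.
    """
    costs = np.asarray(costs, dtype=float)
    best = np.min(costs, axis=1, keepdims=True)   # best cost per problem
    ratios = costs / best                         # performance ratios r_{p,s}
    n_problems = costs.shape[0]
    return np.array([(ratios <= tau).sum(axis=0) / n_problems
                     for tau in taus])

# Hypothetical data: 4 problems x 2 solvers, counting function evaluations.
costs = np.array([[100.0, 150.0],
                  [200.0, 100.0],
                  [np.inf, 300.0],   # solver 0 failed on problem 3
                  [50.0, 60.0]])
rho = performance_profile(costs, taus=[1.0, 2.0, 4.0])
```

Plotting `rho` against `tau` (one curve per solver) gives the usual profile picture: curves that rise earlier and higher dominate.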

You may also enjoy Performance World: http://www.gamsworld.org/performance

Hans Mittelmann maintains benchmark results for all sorts of optimization problems and solvers: http://plato.asu.edu/bench.html

Jorge Moré has a website on benchmarking derivative-free optimization codes: http://www.mcs.anl.gov/~more/dfo/

0

Because optimisation problems vary considerably, you might want to create a library of test problems that is representative of the problems you are trying to solve, optimise these with multiple optimisation algorithms, and compare their performance. For this, an increasingly accepted way to compare optimisation software is performance profiles, as stated by @Dominique. A method to generate these in Python can be found here. For MATLAB, see @Dominique's post.
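As a concrete starting point, such a mini-benchmark takes only a few lines with SciPy. The sketch below is purely illustrative (the two test functions, starting points, and choice of methods are my own assumptions, not a recommended suite); it compares two derivative-free methods by function-evaluation count, which is independent of implementation speed:

```python
import numpy as np
from scipy.optimize import minimize, rosen

# Illustrative problem library: name -> (objective, starting point).
def sphere(x):
    return float(np.sum(x ** 2))

problems = {
    "rosenbrock": (rosen, np.array([-1.2, 1.0])),
    "sphere": (sphere, np.array([3.0, -4.0])),
}

# Two derivative-free methods available in SciPy; record function
# evaluations (nfev) and final objective value for each run.
results = {}
for name, (f, x0) in problems.items():
    for method in ("Nelder-Mead", "Powell"):
        res = minimize(f, x0, method=method)
        results[(name, method)] = (res.nfev, res.fun)

for (name, method), (nfev, fun) in sorted(results.items()):
    print(f"{name:12s} {method:12s} nfev={nfev:4d} f*={fun:.2e}")
```

The resulting `(nfev, fun)` pairs are exactly the kind of cost data you would feed into a performance profile across a larger problem set.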