
What is a "good" way to optimize a 1D convex function $f(x)$ without using the derivatives of that function? By "good", I mean a method which exploits the function's convexity and which minimizes the number of function evaluations (suppose that each evaluation of $f(x)$ is expensive).

I am looking for a link to a well-known method tailored for this task, or a simple explanation of how to implement one.

  • Do you have any constraints? (2017-01-30)
  • Have you performed a search on derivative-free optimization? Considered Nelder-Mead? (2017-01-30)
  • You can use Powell's method: https://en.wikipedia.org/wiki/Powell's_method (2017-01-31)

1 Answer


If you've got an initial set of upper and lower bounds on the location of the optimum, then Golden Section Search is an appropriate algorithm. A slightly more sophisticated algorithm that works very well in practice is Brent's algorithm. See Algorithms for Minimization Without Derivatives by Richard Brent.
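As a concrete illustration, here is a minimal sketch of golden section search in Python (the function name and tolerance are my own choices, not from the answer). Each iteration shrinks the bracket by a factor of roughly 0.618 and costs only one new function evaluation:

```python
import math

def golden_section_search(f, a, b, tol=1e-6):
    """Minimize a unimodal (e.g. convex) f on [a, b] by golden section search.

    Each iteration reuses one of the two interior evaluations, so only
    one new call to f is needed per bracket reduction.
    """
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi ~= 0.618
    c = b - invphi * (b - a)         # left interior point
    d = a + invphi * (b - a)         # right interior point
    fc, fd = f(c), f(d)
    while (b - a) > tol:
        if fc < fd:
            # Minimum lies in [a, d]; old c becomes the new d.
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:
            # Minimum lies in [c, b]; old d becomes the new c.
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    return (a + b) / 2

# Example: the minimum of (x - 2)^2 on [0, 5] is at x = 2.
x_star = golden_section_search(lambda x: (x - 2) ** 2, 0.0, 5.0)
```

In practice a library routine such as `scipy.optimize.minimize_scalar` (with `method='golden'` or `method='brent'`) would be preferable; the sketch above is only to show the mechanism.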

If you don't have any bounds to start with, it's relatively easy to get them by picking three starting points (say $x=0$, $x=1$, and $x=2$), evaluating the function at those three points, and using that information to obtain bounds, or at least an upper or lower bound. If you've got an unbounded interval containing the minimum (say $[l,\infty)$), then you can evaluate $f$ at $l+2^{k}$, for $k=1, 2, \ldots$, until you've bracketed the minimum.
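The doubling step above can be sketched as follows (a hypothetical helper, assuming $f$ is convex and bounded below on $[l,\infty)$; once $f$ starts increasing, convexity guarantees the minimizer lies between the last three sample points):

```python
def bracket_minimum(f, l, max_iter=60):
    """Bracket the minimizer of a convex f on [l, inf) by evaluating
    f at l, l+1, l+2, l+4, ... (i.e. l + 2**k) until f turns upward.
    """
    xs = [l, l + 1]          # l and l + 2**0
    fs = [f(l), f(l + 1)]
    if fs[1] > fs[0]:
        # f is already increasing: the minimizer lies in [l, l + 1].
        return l, l + 1
    for k in range(1, max_iter):
        xs.append(l + 2 ** k)
        fs.append(f(xs[-1]))
        if fs[-1] > fs[-2]:
            # f turned upward: by convexity the minimizer is in [xs[-3], xs[-1]].
            return xs[-3], xs[-1]
    raise RuntimeError("no bracket found; f may be unbounded below")
```

The bracket returned here can then be handed directly to golden section search or Brent's method.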

All of this assumes that there actually is a minimum. If $f(x)$ is unbounded below, you'll have to give up eventually...