
I am new to numerical optimization and I am interested in solving the following problem. Consider a two-parameter function like

$$ g(\omega_1, \omega_2) $$

such that I can evaluate it numerically, but I do not know the analytical dependence of $g$ on the parameters.

Is there a generic method to find the global minimum of $g$ with this information?

1 Answer

I am assuming you would like to find a minimum or a maximum of $g$ over some subset of $\Omega_1 \times \Omega_2$, possibly the whole of $\mathbb{R}^2$.

Assuming $g$ is nice (e.g., continuous), you can use conjugate-direction methods such as Powell's method, which need only function evaluations. If $g$ is differentiable, there are better methods that use the gradient, provided you can compute it numerically.

Here is a nice discussion of the gradient descent method with some examples.
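A minimal sketch of the derivative-free approach with SciPy's `minimize` and `method="Powell"`; the quadratic `g` below is just a hypothetical stand-in for your black-box evaluation routine:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical black-box objective standing in for g(w1, w2);
# in practice, replace this with your numerical evaluation of g.
def g(w):
    w1, w2 = w
    return (w1 - 1.0) ** 2 + (w2 + 2.0) ** 2

# Powell's method is derivative-free: it only calls g, never a gradient.
res = minimize(g, x0=[0.0, 0.0], method="Powell")
print(res.x)  # converges near the minimizer (1, -2)
```

Note that, like gradient descent, this only finds a local minimum: the answer depends on the starting point `x0` when $g$ has several basins.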

  • Thank you! Well, your assumption is correct. Do you know if there is any method for finding specifically the global minimum? (2017-01-17)
  • @dapias It makes no difference whether you minimize or maximize: if the maximum of $g$ is at $(a,b)$, then $(a,b)$ is the minimum of $-g$. For that reason, optimization algorithms generally minimize by default. (2017-01-17)
  • You are absolutely right. I was wondering about the difference between finding local minima and finding the global one. (2017-01-17)
  • @dapias Global is hard. If $g$ is convex, then they coincide and life is nice ;-). If not, you have to start your optimization in different parts of the space, or use something entirely different, like simulated annealing algorithms, and even then there are no guarantees. (2017-01-17)