
Let's say we have a smooth function $f:\mathbb{R}^{1000000} \rightarrow \mathbb{R}$ that we want to minimize using a method from numerical optimization. Which method should we choose? Is the conjugate gradient method the best choice? Which methods work better than others for minimizing high-dimensional problems?

Thank you very much for your time!

  • The answer is "it depends". The basic conjugate gradient algorithm only works when the function $f$ is a quadratic form, and it works best when the problem is sparse. Is that the case here? There are also nonlinear conjugate gradient methods that work with numerical derivatives, which may work well for you (see the sketch after these comments). Randomized algorithms such as [simulated annealing](http://en.wikipedia.org/wiki/Simulated_annealing) can also work well in high dimensions. Can you give any more information about the problem? (2012-07-21)
  • I would echo @ChrisTaylor's remarks. The form of $f$ matters too. Do you have an explicit derivative? Is $f$ separable in some way? Is it convex? Etc. (2012-07-21)
  • I don't have any details on the form of $f$; I was just wondering whether there are preferred algorithms for functions of high dimension. It isn't a question about anything in particular. (2012-07-22)
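
For concreteness, here is a minimal sketch of the nonlinear conjugate gradient approach mentioned in the first comment, using SciPy. Since the question gives no details about $f$, a simple separable quadratic stands in as a hypothetical test function; the point is that supplying an analytic gradient avoids a million finite-difference evaluations per gradient at this dimension.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-in for the unspecified f: a smooth, convex,
# separable quadratic in n = 10^6 variables.
n = 1_000_000
diag = np.linspace(1.0, 10.0, n)   # diagonal of a well-conditioned Hessian

def f(x):
    return 0.5 * np.dot(diag * x, x)

def grad_f(x):
    return diag * x                # analytic gradient of f

x0 = np.ones(n)

# Nonlinear conjugate gradient (Polak-Ribiere variant in SciPy).
# Passing jac= explicitly matters at this scale: finite differences
# would cost n extra function evaluations per gradient.
result = minimize(f, x0, jac=grad_f, method="CG")
print(result.fun, result.nit)
```

If the Hessian is unavailable, a limited-memory quasi-Newton method (`method="L-BFGS-B"` in the same call) is another standard choice in high dimensions, since it stores only a few vectors rather than an $n \times n$ matrix.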

1 Answer