
Let's say we have a smooth function $f:\mathbb{R}^{1000000} \rightarrow \mathbb{R}$ that we want to minimize using a method from numerical optimization. Which method should we choose? Is the conjugate gradient method the best choice? Which methods work better than others for minimizing high-dimensional problems?
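For concreteness, a common default for large-scale smooth unconstrained problems is a limited-memory quasi-Newton method such as L-BFGS, which stores only a few gradient-difference pairs instead of a full $n \times n$ Hessian. A minimal sketch using SciPy (the quadratic objective here is a placeholder, since the actual $f$ is unspecified):

```python
import numpy as np
from scipy.optimize import minimize

n = 1_000_000  # dimension of the problem

# Placeholder smooth objective: a separable quadratic with minimum at x = 1.
def f(x):
    return np.sum((x - 1.0) ** 2)

# Supplying the analytic gradient avoids n finite-difference evaluations per step.
def grad(x):
    return 2.0 * (x - 1.0)

x0 = np.zeros(n)
# L-BFGS-B keeps only m ~ 10 vector pairs, so memory stays O(m * n)
# rather than the O(n^2) a full Newton or BFGS method would need.
res = minimize(f, x0, jac=grad, method="L-BFGS-B")
print(res.success, f(res.x))
```

Note that for a genuinely quadratic objective, (nonlinear) conjugate gradients would also perform well; the practical trade-off at this scale is usually between CG-type and limited-memory quasi-Newton methods.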

Thank you very much for your time!

  • I don't have any details on the form of $f$. I was just wondering if there are some preferred algorithms when you have functions of high dimension; it's not a question about anything specific. (2012-07-22)

1 Answer


I don't know much about it, but in Support Vector Machine-type problems, people use the KKT conditions to dualize the problem and make it independent of the dimension. You could read about that and see whether similar ideas apply in your case; see for example http://en.wikipedia.org/wiki/Support_vector_machine

  • The asker didn't say anything about constraints, so neither the KKT conditions nor duality apply. (2012-07-22)