Difference between conjugate gradient method and gradient descent

Can you explain to me the difference between the two algorithms? They look very similar.
optimization
numerical-methods
algorithms
numerical-optimization
gradient-descent
It would make more sense if you gave the context in which both of these notions are applied. The conjugate gradient method maintains a small but useful "memory" of what has been computed previously, while gradient descent only uses the current gradient direction for "descent". That may be the distinction you want to draw, but it only becomes useful and concrete in the setting of an intended application. – 2017-01-26
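The "memory" distinction in the comment above can be made concrete on the standard test problem for both methods: minimising the quadratic $f(x) = \tfrac{1}{2}x^\top A x - b^\top x$ with $A$ symmetric positive definite. The sketch below is a minimal pure-Python illustration, not from the thread; the matrix `A`, vector `b`, step size `alpha`, and iteration counts are illustrative choices. Note how gradient descent uses only the current gradient, while conjugate gradient blends the new residual with the previous search direction via `beta`:

```python
# Compare gradient descent with conjugate gradient on the quadratic
# f(x) = 1/2 x^T A x - b^T x, whose minimiser solves A x = b.
# A, b, alpha, and the step counts are illustrative, not canonical.

A = [[4.0, 1.0], [1.0, 3.0]]   # symmetric positive definite
b = [1.0, 2.0]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def gradient_descent(steps=100, alpha=0.1):
    # Each step uses only the *current* gradient A x - b; no memory.
    x = [0.0, 0.0]
    for _ in range(steps):
        g = [ai - bi for ai, bi in zip(matvec(A, x), b)]
        x = [xi - alpha * gi for xi, gi in zip(x, g)]
    return x

def conjugate_gradient(steps=2):
    # Keeps a "memory": the new search direction p mixes the current
    # residual with the previous direction (weighted by beta), which
    # makes successive directions A-conjugate.
    x = [0.0, 0.0]
    r = [bi - ai for bi, ai in zip(b, matvec(A, x))]  # residual b - A x
    p = r[:]
    for _ in range(steps):
        Ap = matvec(A, p)
        alpha = dot(r, r) / dot(p, Ap)                # exact line search
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r_new = [ri - alpha * api for ri, api in zip(r, Ap)]
        beta = dot(r_new, r_new) / dot(r, r)          # the "memory" term
        p = [ri + beta * pi for ri, pi in zip(r_new, p)]
        r = r_new
    return x

x_gd = gradient_descent()
x_cg = conjugate_gradient()
```

In exact arithmetic, conjugate gradient reaches the exact minimiser of an $n$-dimensional SPD quadratic in at most $n$ steps (here $n = 2$), whereas gradient descent only converges to it geometrically, at a rate governed by the condition number of $A$.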