On Jan 5, 2011, at 4:08 AM, Yury V. Zaytsev wrote:
Do I understand correctly that none of the gradient-based optimization methods included in NLopt have built-in finite-difference or automatic-differentiation approximators (as, for instance, SciPy's fmin_l_bfgs_b does), and that one has to either provide gradients by programming the analytic formulas or perform the numeric approximation oneself?
In general, as mentioned in the manual, it is a good idea to use analytic derivatives when using the gradient-based algorithms. If you don't have analytical derivatives, you would usually use the derivative-free algorithms (which often internally construct their own approximate gradient for you, but with only as much accuracy as they need to make progress). Analytical derivatives can be computed far more cheaply than difference approximations, especially when you have many parameters, and also far more accurately. Computing difference approximations accurately can be tricky, because the step size dx obviously cannot be too large (or the truncation error dominates), but it also cannot be too small (as otherwise roundoff errors will kill you).
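
For concreteness, here is one common recipe (a rough Python sketch, not something built into NLopt; the function and argument names are just for illustration): a central-difference gradient whose per-coordinate step tries to balance truncation error against roundoff.

import numpy as np

def approx_gradient(f, x, rel_step=None):
    # Central-difference approximation to the gradient of f at x.
    # The step is scaled to each coordinate's magnitude; eps**(1/3)
    # is a standard compromise for central differences, since too
    # large a step gives truncation error and too small a step lets
    # roundoff error dominate.
    x = np.asarray(x, dtype=float)
    if rel_step is None:
        rel_step = np.finfo(float).eps ** (1.0 / 3.0)   # about 6e-6
    grad = np.empty_like(x)
    for i in range(x.size):
        h = rel_step * max(1.0, abs(x[i]))
        e = np.zeros_like(x)
        e[i] = h
        grad[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return grad

Even so, this costs 2n extra function evaluations per gradient for n parameters, which is exactly why analytic derivatives are preferable when the number of parameters is large.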
However, since many people seem to want this, I have been thinking about adding a numerical-differentiation interface to NLopt. Especially for the case of constrained optimization, it might be nice to be able to try a local algorithm other than COBYLA.
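
In the meantime, nothing stops you from doing the wrapping yourself on the user side. A rough Python sketch of what that might look like with the existing NLopt Python interface (the wrapper name and the toy constrained problem are made up for illustration, and it reuses the approx_gradient function sketched above):

import numpy as np
import nlopt

def with_numerical_gradient(f):
    # Adapt a plain f(x) -> float to the f(x, grad) callback form that
    # NLopt's gradient-based algorithms expect, filling grad in place
    # by central differences whenever a gradient is requested.
    def wrapped(x, grad):
        if grad.size > 0:
            grad[:] = approx_gradient(f, x)
        return f(x)
    return wrapped

# Toy constrained problem: minimize x0^2 + x1^2 subject to x0 + x1 >= 1,
# solved with MMA (a gradient-based local algorithm) instead of COBYLA.
objective  = lambda x: x[0] ** 2 + x[1] ** 2
constraint = lambda x: 1.0 - x[0] - x[1]      # NLopt convention: fc(x) <= 0

opt = nlopt.opt(nlopt.LD_MMA, 2)
opt.set_min_objective(with_numerical_gradient(objective))
opt.add_inequality_constraint(with_numerical_gradient(constraint), 1e-8)
opt.set_xtol_rel(1e-6)
xopt = opt.optimize(np.array([2.0, 2.0]))     # should approach (0.5, 0.5)

An interface in NLopt itself would essentially automate this wrapping, ideally with better step-size heuristics.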
Steven