Hi everyone,

I'm currently using the Python wrappers for NLopt.  I'm working on a
non-convex least-squares problem, where the local search portion of the
optimization is best handled by a least-squares algorithm such as
Levenberg-Marquardt (the SciPy implementation, or any other one, really).
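
For concreteness, the local step I have in mind is roughly the following
(a toy damped-oscillation model with synthetic data, purely to illustrate
and standing in for my actual residuals):

import numpy as np
from scipy.optimize import least_squares

def residuals(theta, t, y):
    # Toy non-convex model: damped oscillation with unknown amplitude,
    # decay rate and frequency.
    a, b, w = theta
    return a * np.exp(-b * t) * np.cos(w * t) - y

t = np.linspace(0.0, 10.0, 200)
rng = np.random.default_rng(0)
y = 2.0 * np.exp(-0.3 * t) * np.cos(1.5 * t) + 0.05 * rng.standard_normal(t.size)

x0 = np.array([1.0, 0.1, 1.0])          # a single local starting point
fit = least_squares(residuals, x0, args=(t, y), method="lm")
print(fit.x, fit.cost)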

Originally, I was thinking of simply doing something like Latin hypercube
sampling of starting points to explore the full parameter space, but I was
wondering whether I could instead combine the local least-squares
optimization step with one of NLopt's global algorithms, such as MLSL, to
cover the landscape more efficiently.
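
To make the question concrete, what I'm imagining for the MLSL route is
roughly the sketch below.  If I'm reading the docs correctly, the local
stage is set with set_local_optimizer and has to be another NLopt
algorithm, so here I fold the residuals into a scalar sum of squares and
use a derivative-free local optimizer (the bounds and the choice of BOBYQA
are just placeholders):

import numpy as np
import nlopt

# Same toy residual and data as in the local sketch above.
t = np.linspace(0.0, 10.0, 200)
rng = np.random.default_rng(0)
y = 2.0 * np.exp(-0.3 * t) * np.cos(1.5 * t) + 0.05 * rng.standard_normal(t.size)

def residuals(theta):
    a, b, w = theta
    return a * np.exp(-b * t) * np.cos(w * t) - y

def sum_of_squares(x, grad):
    # grad is left untouched because the local algorithm below is derivative-free.
    return float(np.sum(residuals(x) ** 2))

n = 3
opt = nlopt.opt(nlopt.G_MLSL_LDS, n)     # MLSL with low-discrepancy sampling
local = nlopt.opt(nlopt.LN_BOBYQA, n)    # local stage: a derivative-free NLopt algorithm
local.set_xtol_rel(1e-8)
opt.set_local_optimizer(local)
opt.set_min_objective(sum_of_squares)
opt.set_lower_bounds([0.1, 0.0, 0.1])    # placeholder bounds for the three parameters
opt.set_upper_bounds([5.0, 2.0, 5.0])
opt.set_maxeval(2000)
x_best = opt.optimize(np.array([1.0, 0.5, 1.0]))
print(x_best, opt.last_optimum_value())

What I'd really like, though, is to keep a proper least-squares solver like
Levenberg-Marquardt in that local stage rather than a generic scalar
optimizer, hence the question about combining the two.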

Federico
