Hi everyone,

I've been using NLopt to estimate the parameters of a system of
differential equations by minimizing the least-squares error between
my simulations and a set of experimental data.
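
For concreteness, here is a minimal sketch of the kind of objective I'm
minimizing. The simulate() helper below is a hypothetical stand-in (a
toy one-parameter decay model), not my real system or wrapper:

    import numpy as np
    from scipy.integrate import odeint

    # Toy stand-in for the real model: dy/dt = -k * y
    def simulate(params, t):
        k, y0 = params
        return odeint(lambda y, t: -k * y, y0, t)[:, 0]

    def objective(params, t_data, y_data):
        # Sum of squared residuals between simulation and experiment
        resid = simulate(params, t_data) - y_data
        return float(np.dot(resid, resid))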

The gradient-free optimizations have worked very well.  I started
experimenting with gradient-based methods, estimating the Jacobian
from the sensitivity equations computed by SUNDIALS, and found that
with SciPy's fmin_l_bfgs_b I'm able to reach a local minimum very
quickly.  Every gradient-based NLopt algorithm, however, quickly
throws an NLopt error.

The system I am working with is quite gnarly: the objective is
extremely sensitive to some parameters and almost insensitive to others.

SciPy fmin_l_bfgs_b output:

http://pastie.org/8374252

NLopt LBFGS output:

http://pastie.org/8374254

I've tried other NLopt algorithms, but every gradient-based method I
tried eventually throws an error.  Any clue what might be causing it?
I am happy to share the code, but it is a fairly large wrapper around
the SciPy and Assimulo ODE solvers, so producing a minimal example is
not easy.

Federico
