On Nov 2, 2011, at 3:01 PM, Juan Carlos Lopez Alfonso wrote:
C++, in order to get more memory and less compute time. I have used the MATLAB Global Optimization Toolbox to solve an optimization problem with, for example, more than 2000 variables. But now I need to extend my model, and I will have about 3 million variables. Is it possible to use NLopt with this number of optimization variables?
You can do local optimization with millions of variables, in principle, but only using the low-storage gradient-based methods (MMA, LBFGS, ...). How fast this converges will depend on your objective function, of course.
True global optimization in millions of variables is generally going to be impractical by any method unless your function is very special (e.g. convex). The best you can do is to explore several local minima from different random starting points. (I would say the same thing about global optimization with thousands of variables, BTW.)
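The multistart idea is simple enough to sketch in a few lines. This toy example uses a hand-rolled 1-D gradient descent as the local solver (in practice you would call an NLopt local optimizer such as LD_LBFGS) on a function with two local minima, keeping the best result over random restarts:

```python
# Illustrative sketch (not NLopt itself): multistart local optimization.
# Run a local search from several random starting points and keep the
# best local minimum found.
import random

def f(x):
    return x**4 - 3*x**2 + x       # two local minima; global near x = -1.30

def df(x):
    return 4*x**3 - 6*x + 1

def grad_descent(x, lr=0.01, steps=2000):
    # Stand-in local optimizer; replace with a real solver in practice.
    for _ in range(steps):
        x -= lr * df(x)
    return x

random.seed(0)
best_x, best_f = None, float("inf")
for _ in range(20):                # 20 random restarts in [-2, 2]
    x = grad_descent(random.uniform(-2.0, 2.0))
    if f(x) < best_f:
        best_x, best_f = x, f(x)
print(best_x, best_f)              # expect roughly (-1.30, -3.51)
```

There is no guarantee the global minimum is found; you are only sampling basins of attraction, which is exactly why this degrades in high dimensions.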
My second question is related to the Hessians of the objective function and the constraints. I have read the documentation and I don't find how I can define the Hessians, although I can define the gradient of the objective function. Specifically, is it possible in NLopt to manually define the Hessian of my objective function and the Hessian of a constraint? I need to pass my Hessians manually in order to avoid their evaluation by the optimization algorithms and ensure the validity of my results.
No, there are no methods in NLopt which take advantage of an analytically specified Hessian.
(Note that specifying the Hessian is impractical with millions of variables anyway, because it would be a millions-by-millions matrix. That's why quasi-Newton methods typically only use a low-rank approximation of the Hessian.)
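To make the low-rank point concrete, here is a sketch of the standard L-BFGS two-loop recursion: the inverse-Hessian approximation is never formed as a matrix, only the last m update pairs (s, y) are stored, so memory is O(m*n). The quadratic test problem and fixed parameters below are illustrative choices, not NLopt internals:

```python
# Sketch of why "low-storage" quasi-Newton methods avoid an n-by-n Hessian:
# L-BFGS keeps only the last m (s, y) pairs and applies the implicit
# inverse-Hessian estimate via the two-loop recursion.
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Return -H*grad, where H is the implicit L-BFGS inverse-Hessian."""
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / np.dot(y, s)
        a = rho * np.dot(s, q)
        q -= a * y
        alphas.append((a, rho))
    if s_list:  # scale by gamma = s.y / y.y from the most recent pair
        s, y = s_list[-1], y_list[-1]
        q *= np.dot(s, y) / np.dot(y, y)
    for (a, rho), s, y in zip(reversed(alphas), s_list, y_list):
        b = rho * np.dot(y, q)
        q += (a - b) * s
    return -q

# Minimize f(x) = 0.5 * x^T D x for a diagonal D; gradient is D*x.
n, m = 1000, 10
D = np.linspace(1.0, 10.0, n)
f = lambda x: 0.5 * np.dot(x, D * x)
x = np.ones(n)
s_list, y_list = [], []
for _ in range(200):
    g = D * x
    if np.linalg.norm(g) < 1e-10:
        break
    p = lbfgs_direction(g, s_list, y_list)
    t = 1.0
    while f(x + t * p) > f(x) + 1e-4 * t * np.dot(g, p):  # Armijo backtracking
        t *= 0.5
    x_new = x + t * p
    s_list.append(x_new - x)
    y_list.append(D * (x_new - x))  # gradient difference for this quadratic
    if len(s_list) > m:             # keep only the last m pairs: O(m*n) memory
        s_list.pop(0)
        y_list.pop(0)
    x = x_new
print(f(x))  # near-zero objective at the minimum
```

Storing the full Hessian here would be an n-by-n array; the recursion touches only 2m vectors of length n.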
_______________________________________________ NLopt-discuss mailing list [email protected] http://ab-initio.mit.edu/cgi-bin/mailman/listinfo/nlopt-discuss
