On Nov 25, 2013, at 9:56 AM, Adel M <[email protected]> wrote:

> Is it possible to add constraints to derivative-free optimization algorithms?

Some of them, e.g. COBYLA, already support constraints.  The manual lists 
which algorithms support nonlinear constraints.
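
(For concreteness, here is a rough sketch in the Python interface of using 
LN_COBYLA with a nonlinear inequality constraint; the 2-variable objective and 
the constraint are just made-up toy examples, not anything from your problem.)

import nlopt

# Toy example (made up): minimize x[0] + x[1] subject to
# the nonlinear inequality x[0]*x[1] >= 1, written as fc(x) <= 0.
def objective(x, grad):
    return x[0] + x[1]

def constraint(x, grad):
    return 1.0 - x[0] * x[1]

opt = nlopt.opt(nlopt.LN_COBYLA, 2)      # derivative-free, handles constraints
opt.set_min_objective(objective)
opt.add_inequality_constraint(constraint, 1e-8)
opt.set_lower_bounds([0.1, 0.1])
opt.set_xtol_rel(1e-6)
x = opt.optimize([2.0, 2.0])
print(x, opt.last_optimum_value())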

Other algorithms like Nelder-Mead in NLopt don't currently support nonlinear 
constraints, but they could be used in conjunction with the AugLag algorithm to 
implement constraints.
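
(Again just a sketch with the same made-up toy problem as above, but with 
Nelder-Mead as the inner local optimizer and AUGLAG handling the constraint.)

import nlopt

# Same toy objective/constraint as above; AUGLAG folds the constraint
# into the objective and Nelder-Mead does the unconstrained minimization.
def objective(x, grad):
    return x[0] + x[1]

def constraint(x, grad):
    return 1.0 - x[0] * x[1]

local = nlopt.opt(nlopt.LN_NELDERMEAD, 2)
local.set_xtol_rel(1e-6)

opt = nlopt.opt(nlopt.AUGLAG, 2)     # outer algorithm handles the constraints
opt.set_local_optimizer(local)       # inner algorithm does the minimization
opt.set_min_objective(objective)
opt.add_inequality_constraint(constraint, 1e-8)
opt.set_lower_bounds([0.1, 0.1])
opt.set_upper_bounds([10.0, 10.0])
opt.set_xtol_rel(1e-6)
x = opt.optimize([2.0, 2.0])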

> 
> I have an optimization problem over a vector X of size n (n > 20) and have a 
> sum constraint on some X[i], for example X[1]+X[2]+…+X[10] = 1

A simple linear equality constraint like that can be handled by elimination: 
just optimize over X[1…9] and set X[10] = 1 - (X[1]+…+X[9]).

(Unless there is also a bound constraint on X[10]?)
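
(Rough sketch of that elimination in the Python interface, assuming a made-up 
n = 12 and a stand-in objective; the real objective is whatever you actually 
have.)

import numpy as np
import nlopt

n = 12   # made-up size; the real problem has n > 20

def full_objective(x):
    # stand-in for the real objective over all n variables
    return float(np.sum((x - 0.3) ** 2))

def reduced_objective(y, grad):
    # y holds X[1..9] and X[11..n]; X[10] is eliminated
    x = np.empty(n)
    x[:9] = y[:9]                    # X[1..9]
    x[9] = 1.0 - np.sum(y[:9])       # X[10] = 1 - (X[1]+...+X[9])
    x[10:] = y[9:]                   # X[11..n]
    return full_objective(x)

opt = nlopt.opt(nlopt.LN_NELDERMEAD, n - 1)   # optimize over n-1 variables
opt.set_min_objective(reduced_objective)
opt.set_xtol_rel(1e-6)
y = opt.optimize(np.full(n - 1, 0.1))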

_______________________________________________
NLopt-discuss mailing list
[email protected]
http://ab-initio.mit.edu/cgi-bin/mailman/listinfo/nlopt-discuss
