Yes, I tried COBYLA, but despite using finite differences, both SQP and 
MMA were more efficient. In addition, COBYLA didn't converge for larger-scale 
problems (I suspect the problem is that my starting point is close to 
infeasibility). Unfortunately there is no simple way to use the adjoint 
operator in my case.
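For context, a minimal sketch of the kind of finite-difference gradient wrapper one would pass to a gradient-based algorithm such as MMA or SQP when analytic/adjoint derivatives are unavailable. The helper name `fd_grad` is hypothetical, not part of NLopt; it just illustrates the cost (2n extra function evaluations per gradient) behind the advice below:

```python
def fd_grad(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at x.

    Hypothetical helper for illustration: costs 2*len(x) extra function
    evaluations per gradient, which is one reason analytic or adjoint
    derivatives are preferred when they are available.
    """
    g = [0.0] * len(x)
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g[i] = (f(xp) - f(xm)) / (2.0 * h)
    return g
```

In NLopt's Python binding one would then fill the `grad` array of the objective callback from such a helper before returning the function value.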


-------- Original Message --------
> Date: Wed, 29 Feb 2012 13:09:24 -0500
> From: "Steven G. Johnson" <[email protected]>
> To: nlopt-discuss <[email protected]>
> Subject: Re: [NLopt-discuss] MMA inner iterations

> 
> On Feb 29, 2012, at 1:03 PM, Sascha Merz wrote:
> 
> > Thanks a lot for the quick and helpful reply (since I use finite  
> > differences to compute gradients I will try the original Svanberg  
> > algorithm)!
> 
> Have you considered using a derivative-free algorithm like COBYLA?  Or  
> computing the derivatives analytically by an adjoint method?  One  
> should almost always avoid finite-difference derivatives.
> 


_______________________________________________
NLopt-discuss mailing list
[email protected]
http://ab-initio.mit.edu/cgi-bin/mailman/listinfo/nlopt-discuss
