Hi Steven,

I am using NLopt now and I am quite happy with the results. 

Thank you!

On Sunday, January 19, 2014 4:29:00 PM UTC+1, Steven G. Johnson wrote:
>
> The NLopt package provides both gradient-based (where you have to supply 
> the analytical gradient) and derivative-free (where you only supply the 
> objective function) optimizers.
>
> It is not really practical to do optimization of "very" high-dimensional 
> problems without knowing the gradient analytically.  (Normally you 
> shouldn't need to provide a Hessian, however.)   But I think of "very" as 
> being 1000s of dimensions; if you only have tens of dimensions, that is 
> fine for derivative-free optimizers.
>
> On Thursday, January 16, 2014 7:03:27 PM UTC-5, jbeginner wrote:
>
>> I am trying to use Julia's Ipopt interface for an optimization problem. I 
>> have two questions. Firstly, is it possible to provide only the objective 
>> function and starting values and not bother with the gradient, Hessian, 
>> etc., or alternatively would providing the objective function and gradient 
>> suffice? I know that this greatly reduces the performance of the solver, but 
>> is it possible? My function is very high-dimensional and it would be very 
>> cumbersome to compute those manually. 
>>
>

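A minimal sketch of the two approaches Steven describes, using the NLopt.jl API. The quadratic objective, starting point, and algorithm choices here are only illustrative; the same callback works for both cases because NLopt passes an empty `grad` vector when a derivative-free algorithm is used.

```julia
using NLopt

# Toy objective: (x1 - 1)^2 + (x2 - 2)^2, minimum at (1, 2).
# For gradient-based algorithms NLopt passes a non-empty `grad`
# vector that must be filled in place; derivative-free algorithms
# pass an empty one, so the gradient code is simply skipped.
function myfunc(x::Vector, grad::Vector)
    if length(grad) > 0
        grad[1] = 2 * (x[1] - 1)
        grad[2] = 2 * (x[2] - 2)
    end
    return (x[1] - 1)^2 + (x[2] - 2)^2
end

# Derivative-free (LN_ prefix): only objective values are used.
opt = Opt(:LN_NELDERMEAD, 2)
xtol_rel!(opt, 1e-8)
min_objective!(opt, myfunc)
(minf, minx, ret) = optimize(opt, [0.0, 0.0])

# Gradient-based (LD_ prefix): same callback, gradient now required.
opt2 = Opt(:LD_LBFGS, 2)
xtol_rel!(opt2, 1e-8)
min_objective!(opt2, myfunc)
(minf2, minx2, ret2) = optimize(opt2, [0.0, 0.0])
```

In NLopt's algorithm names the `LN_` prefix means local/no-derivative and `LD_` means local/derivative-based, which is how you pick between the two modes.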