At 12:10 on Thursday, June 28 2012, Matt Peddie wrote:
> At 12:02 on Thursday, June 28 2012, Steven G. Johnson wrote:
> > Just because automatic differentiation gives you the Hessian doesn't
> > mean that it is free (in terms of computer time).  Computation of
> > gradients by adjoint methods (or automatic differentiation in
> > "reverse" mode) is comprable in cost to computing the objective
> > function, but my understanding is that the computation of the Hessian
> > in general scales like the objective function multiplied by the number
> > of degrees of freedom.
> 
> Of course it's not computationally free; it's just no extra coding work
> for me.  I still have to evaluate the Hessian at each point, but since
> the method is already estimating a Hessian at each point and using it, I
> thought that supplying the exact one instead might not be much more
> expensive and might take better steps.
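
(Just to make the scaling concrete for myself: assembling the full
Hessian from gradients would look something like the sketch below, one
gradient sweep per coordinate direction.  Since each gradient sweep
costs about as much as one objective evaluation, the whole thing costs
roughly n objective evaluations.  I'm using finite differences of the
gradient purely as a stand-in for however the AD tool actually gets
its second derivatives.)

    import numpy as np

    def hessian_via_gradients(grad, x, eps=1e-6):
        # One gradient call per coordinate direction, plus the base
        # point: (n + 1) gradient sweeps in total, hence the
        # "objective cost times n" scaling for the full Hessian.
        n = x.size
        g0 = grad(x)
        H = np.empty((n, n))
        for i in range(n):
            step = np.zeros(n)
            step[i] = eps
            H[:, i] = (grad(x + step) - g0) / eps
        return 0.5 * (H + H.T)  # symmetrize to clean up round-off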

It occurs to me that maybe you were suggesting that maintaining the
estimated Hessian is much cheaper than evaluating the exact one at
every step.  If that's the case, then I understand, and it makes
perfect sense to stick with the estimate.  Thank you!

Matt
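
P.S.  For my own notes: if the estimate is something like a BFGS update
(an assumption on my part; I haven't looked at what NLopt's quasi-Newton
code actually does), then refreshing it needs nothing beyond the step
and the gradient difference the optimizer has already computed.  That's
O(n^2) arithmetic with no extra objective or gradient evaluations,
which is why it's so much cheaper than the exact Hessian:

    import numpy as np

    def bfgs_hessian_update(B, s, y):
        # B: current Hessian estimate
        # s: x_new - x_old        (the step just taken)
        # y: grad_new - grad_old  (gradients evaluated anyway)
        Bs = np.dot(B, s)
        return (B
                - np.outer(Bs, Bs) / np.dot(s, Bs)
                + np.outer(y, y) / np.dot(y, s))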

