On Tue, Mar 8, 2011 at 2:49 PM, Steven G. Johnson <[email protected]> wrote:
> Assuming Peter is using standard terminology, automatic differentiation
> does *not* mean numerical differentiation with finite differences.
> Automatic differentiation means calculating the exact derivative
> analytically, just using a computer program rather than doing it by
> hand: an AD tool differentiates your source code symbolically.
>
> So, automatic differentiation can in principle be as efficient as
> programming analytical derivatives by hand, and with it the
> gradient-based algorithms are probably far superior to the
> derivative-free algorithms for this many unknowns.
>
> In principle, since AD takes source code that computes an objective and
> generates source code that computes objective+gradient, you could
> possibly edit the resulting code to remove the parts needed only for
> computing the objective. But editing program-generated code by hand is
> messy, and fragile because you need to redo it each time you change
> your objective, so I can understand why Peter wouldn't want to do it.
Ah, duh, I was tunnel-visioned on finite differences because of the statements that the differentiation was very expensive compared to the objective and that 'computing the derivative involves computing the function'. Don't mind me. :)
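
For anyone skimming the archive later, here is a minimal sketch of the dual-number trick behind forward-mode AD. It is purely illustrative (real AD tools are far more sophisticated): each value carries its derivative along with it, so f(x) and f'(x) fall out of the same pass, exactly, with no finite-difference step size involved.

    # Forward-mode AD sketch using dual numbers (illustrative only).
    # A Dual holds a value and its derivative w.r.t. one input; the
    # overloaded operators apply the sum and product rules, so running
    # the *unmodified* objective on Duals yields value + exact derivative.
    class Dual:
        def __init__(self, val, dot=0.0):
            self.val = val  # function value
            self.dot = dot  # derivative of the value w.r.t. the seed input

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.dot + other.dot)
        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # product rule: (uv)' = u'v + u v'
            return Dual(self.val * other.val,
                        self.dot * other.val + self.val * other.dot)
        __rmul__ = __mul__

    def f(x):                  # the same source code runs on floats or Duals
        return x * x * x + 2.0 * x

    y = f(Dual(3.0, 1.0))      # seed dx/dx = 1
    print(y.val, y.dot)        # 33.0 29.0  (f'(x) = 3x^2 + 2, exact)

(For many unknowns you would want reverse mode, which produces the whole gradient at roughly the cost of a few objective evaluations, but the point about exactness is the same.)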
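
Also worth noting: NLopt's gradient-based interface already expects the value and gradient from a single callback, which is exactly the shape an AD-generated objective+gradient routine produces, so nothing needs stripping out when the gradient is wanted anyway. A sketch against the Python bindings, with a throwaway quadratic standing in for the real objective:

    import numpy as np
    import nlopt

    def f(x, grad):
        # NLopt passes an empty grad for derivative-free algorithms; when
        # it is non-empty we must fill it in place.  Here the gradient is
        # written by hand; in Peter's setting it would come from the
        # AD-generated code, computed alongside the objective value.
        if grad.size > 0:
            grad[:] = 2.0 * (x - 1.0)   # gradient of the stand-in objective
        return float(np.sum((x - 1.0) ** 2))

    n = 100
    opt = nlopt.opt(nlopt.LD_LBFGS, n)   # gradient-based: low-storage BFGS
    opt.set_min_objective(f)
    opt.set_ftol_rel(1e-10)
    xmin = opt.optimize(np.zeros(n))
    print(opt.last_optimum_value())      # ~0, minimum at x = 1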
