This is really super cool.

On Fri, Dec 20, 2013 at 10:55 AM, Miles Lubin <miles.lu...@gmail.com> wrote:

> Automatic differentiation using dual numbers has been integrated into
> Optim via the "autodiff" keyword option and released as version 0.2.0:
>
> rosenbrock(x::Vector) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
>
> optimize(rosenbrock, [0.0, 0.0], method = :l_bfgs, autodiff = true)
>
> When this option is set, exact numerical derivatives will be computed
> by passing in a Vector{Dual{Float64}} to the input function. The
> function must be written to accept a generic Vector, and should be
> type-stable (e.g. using zero(eltype(x)) instead of 0.0 for
> accumulators). Using dual numbers costs about the same as computing
> gradients via finite differences, but the resulting gradient has much
> greater numerical accuracy. It is still more efficient to provide your
> own gradient function if one is available. Reverse-mode AD is planned.
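>
> For illustration, a minimal sketch of an accumulator-based objective
> kept type-stable this way (rosenbrock_nd and the call below are
> hypothetical, not part of the release): initializing the accumulator
> with zero(eltype(x)) lets the same code run on Vector{Float64} and
> Vector{Dual{Float64}} inputs.
>
> function rosenbrock_nd(x::Vector)
>     s = zero(eltype(x))  # accumulator takes the element type of x (Float64 or Dual)
>     for i in 1:length(x)-1
>         s += (1.0 - x[i])^2 + 100.0 * (x[i+1] - x[i]^2)^2
>     end
>     return s
> end
>
> optimize(rosenbrock_nd, zeros(4), method = :l_bfgs, autodiff = true)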
>
> We encourage Optim users to test out this functionality and report any
> issues; we're considering making it the default.
>
> Miles
>
>
> On Wed, Dec 11, 2013 at 5:23 PM, Miles Lubin <miles.lu...@gmail.com>
> wrote:
> > PR to make this work transparently with Optim:
> > https://github.com/JuliaOpt/Optim.jl/pull/39
>