Are there any benchmark results for the "more performant and accurate" bit?
On Thursday, September 3, 2015 at 11:25:01 PM UTC+3, Jarrett Revels wrote:
> I'm proud to announce that we've tagged and released a new version of
> ForwardDiff.jl (https://github.com/JuliaDiff/ForwardDiff.jl).
>
> ForwardDiff.jl is a package for performing automatic differentiation
> <https://en.wikipedia.org/wiki/Automatic_differentiation> on native Julia
> functions/callable objects. The techniques used by this package *are more
> performant and accurate than other standard algorithms for differentiation*,
> so if taking derivatives is something you're at all interested in, I
> suggest you give ForwardDiff.jl a try!
>
> If you don't already have the package, you can install it with Julia's
> package manager by running the following:
>
> julia> Pkg.update(); Pkg.add("ForwardDiff")
>
> If you already have the old version of ForwardDiff.jl, you can update it
> to the new one by simply running Pkg.update().
>
> Note that *the new version of ForwardDiff.jl only supports Julia v0.4.*
> Julia v0.3 users will have to stick to the old version of the package.
> Also note that *the new version introduces some breaking changes*, so
> you'll probably have to rewrite any old ForwardDiff.jl code you have (I
> promise it'll be worth it).
>
> I've spent a good chunk of the summer overhauling it as part of my Julia
> Summer of Code project, so I hope other folks will find it useful. As
> always, opening issues and pull requests in ForwardDiff.jl's GitHub repo
> is very welcome.
>
> I'd like to thank Julia Computing, NumFocus, and The Betty and Gordon
> Moore Foundation for putting JSoC together. And, of course, I thank Miles
> Lubin, Theo Papamarkou, and a host of other Julians for their invaluable
> guidance and mentorship throughout the project.
>
> Best,
> Jarrett
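
For anyone trying the package after installing it as above, here is a minimal usage sketch. It assumes the v0.4-era API in which `ForwardDiff.derivative` and `ForwardDiff.gradient` take a function and an evaluation point; check the README if the released API differs:

```julia
using ForwardDiff

# Derivative of a scalar function: d/dx sin(x) at x = 1.0.
# Forward-mode AD gives the exact derivative (up to floating point),
# unlike finite differences, which incur truncation error.
f(x) = sin(x)
df = ForwardDiff.derivative(f, 1.0)   # ≈ cos(1.0)

# Gradient of a multivariate function g: R^2 -> R.
g(v) = v[1]^2 + 3 * v[2]
grad = ForwardDiff.gradient(g, [2.0, 1.0])   # ≈ [4.0, 3.0]

println(df)
println(grad)
```

The accuracy claim in the announcement comes from this property of forward-mode AD: derivatives are propagated through each elementary operation symbolically-exactly, so the only error is ordinary floating-point rounding.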
