D'oh! Forgot to include the line:

   julia> x = rand(10000);

On Friday, September 4, 2015 at 11:14:22 AM UTC-4, Jarrett Revels wrote:
>
>> Are there any benchmark results for the "more performant and accurate" 
>> bit?
>
> Very valid question! There are some benchmarks near the bottom of this 
> file <https://github.com/mlubin/EuroAD2015/blob/master/forwarddiff.ipynb>, 
> and the code used to run them can be found here 
> <https://github.com/JuliaDiff/ForwardDiff.jl/tree/master/benchmarks>. 
> Those benchmarks are old, though - the current release should be quite a 
> bit faster than the version they were run against (they were taken right 
> before we fixed a major performance regression; updating them is on my 
> to-do list). 
>
> You may notice that we only really compare against other automatic 
> differentiation (AD) tools. That's because AD encompasses a whole different 
> class of techniques than what's referred to as "symbolic" or "numeric" 
> differentiation: results of AD algorithms are generally within machine 
> epsilon of the "real" answers, and these algorithms have provably faster 
> runtime complexities (e.g. potentially O(1) evaluations of the target 
> function using AD vs. O(n) evaluations using finite differencing). Thus, 
> the benchmarks don't compare against non-AD methods, since that wouldn't 
> say much about how fast the package is - the non-AD methods are generally 
> provably slower. 
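>
> To make the accuracy point concrete, here's a toy dual-number sketch of 
> forward-mode AD (just an illustration of the underlying idea, not how 
> ForwardDiff is actually implemented):
>
>     # dual numbers carry a value and a derivative through the computation
>     immutable Dual
>         val::Float64
>         der::Float64
>     end
>
>     import Base: +, *, sin
>     +(x::Dual, y::Dual) = Dual(x.val + y.val, x.der + y.der)
>     *(x::Dual, y::Dual) = Dual(x.val*y.val, x.der*y.val + x.val*y.der)  # product rule
>     sin(x::Dual) = Dual(sin(x.val), cos(x.val)*x.der)                   # chain rule
>
>     f(x) = sin(x*x)
>
>     exact = 2*2.0*cos(2.0^2)            # d/dx sin(x^2) = 2x*cos(x^2) at x = 2
>     ad = f(Dual(2.0, 1.0)).der          # seed der = 1.0, read off the derivative
>     fd = (f(2.0 + 1e-8) - f(2.0))/1e-8  # forward finite difference
>
>     println(abs(ad - exact))  # 0.0 - exact to machine precision
>     println(abs(fd - exact))  # ~1e-8 - limited by the step size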
>
> That being said, we should probably start benchmarking against traditional 
> methods anyway, both to track performance regressions and to win over 
> people who haven't yet encountered AD. In that spirit, here's a benchmark 
> comparing Calculus.jl's finite-differencing gradient with ForwardDiff.jl's 
> AD gradient on the Ackley test function:
>
>     julia> function ackley(x)
>                a, b, c = 20.0, -0.2, 2.0*π
>                len_recip = inv(length(x))
>                sum_sqrs = zero(eltype(x))
>                sum_cos = sum_sqrs
>                for i in x
>                    sum_cos += cos(c*i)
>                    sum_sqrs += i^2
>                end
>                return (-a * exp(b * sqrt(len_recip*sum_sqrs)) -
>                        exp(len_recip*sum_cos) + a + e)
>            end
>     ackley (generic function with 1 method)
>
>     julia> using Calculus, ForwardDiff
>
>     julia> calc_g = Calculus.gradient(ackley);
>
>     julia> @time calc_g(x);
>       4.379173 seconds (69.50 k allocations: 1.137 MB)
>
>     julia> fd_g = ForwardDiff.gradient(ackley, chunk_size=10);
>
>     julia> @time fd_g(x);
>       0.523414 seconds (2.02 k allocations: 266.266 KB)
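>
> (The chunk_size keyword controls how many components of the input vector 
> are differentiated per pass of the target function; see the README for 
> guidance on choosing it.)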
>
> Best,
> Jarrett
>
> On Friday, September 4, 2015 at 5:27:27 AM UTC-4, Johan Sigfrids wrote:
>>
>> Are there any benchmark results for the "more performant and accurate" 
>> bit?
>>
>> On Thursday, September 3, 2015 at 11:25:01 PM UTC+3, Jarrett Revels wrote:
>>>
>>> I'm proud to announce that we've tagged and released a new version of 
>>> ForwardDiff.jl (https://github.com/JuliaDiff/ForwardDiff.jl).
>>>
>>> ForwardDiff.jl is a package for performing automatic differentiation 
>>> <https://en.wikipedia.org/wiki/Automatic_differentiation> on native 
>>> Julia functions/callable objects. The techniques used by this package *are 
>>> more performant and accurate than other standard algorithms for 
>>> differentiation*, so if taking derivatives is something you're at all 
>>> interested in, I suggest you give ForwardDiff.jl a try!
>>>
>>> If you don't already have the package, you can install it with Julia's 
>>> package manager by running the following:
>>>
>>>     julia> Pkg.update(); Pkg.add("ForwardDiff")
>>>
>>> If you already have the old version of ForwardDiff.jl, you can update it 
>>> to the new one by simply running Pkg.update().
>>>
>>> Note that *the new version of ForwardDiff.jl only supports Julia v0.4.* 
>>> Julia v0.3 users will have to stick to the old version of the package. 
>>> Also note that *the new version introduces some breaking changes*, so 
>>> you'll probably have to rewrite any old ForwardDiff.jl code you have (I 
>>> promise it'll be worth it).
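>>>
>>> As a quick taste of the new API (just a sketch - the README documents 
>>> the full interface):
>>>
>>>     julia> using ForwardDiff
>>>
>>>     julia> f(x) = sum(x.^2);
>>>
>>>     julia> g = ForwardDiff.gradient(f);  # build a gradient function for f
>>>
>>>     julia> g([1.0, 2.0, 3.0])  # == [2.0, 4.0, 6.0], since the gradient is 2x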
>>>
>>> I've spent a good chunk of the summer overhauling it as part of my Julia 
>>> Summer of Code project, so I hope other folks will find it useful. As 
>>> always, opening issues and pull requests in ForwardDiff.jl's GitHub repo is 
>>> very welcome.
>>>
>>> I'd like to thank Julia Computing, NumFOCUS, and the Gordon and Betty 
>>> Moore Foundation for putting JSoC together. And, of course, I thank Miles 
>>> Lubin, Theo Papamarkou, and a host of other Julians for their invaluable 
>>> guidance and mentorship throughout the project.
>>>
>>> Best,
>>> Jarrett
>>>
