Deniz, thanks for the package. I plan on reviewing it this week to decide
whether it's a good fit for JuliaML. We're in the market for fast,
well-typed backprop.

Mosè: what version of Julia are you on? Anonymous functions and closures
are much faster on 0.5... in fact, there should be no performance penalty
vs regular functions, which allows you to rethink the design.
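
For what it's worth, here's a quick way to check this on your own machine
(a minimal sketch using BenchmarkTools, same as in your transcript below;
the function names are just illustrative):

using BenchmarkTools

f(x) = 2x + 1            # regular named function
g = x -> 2x + 1          # anonymous function
h = let a = 2, b = 1     # closure capturing a and b
    x -> a*x + b
end

@benchmark f(1.0)
@benchmark g(1.0)        # on 0.5 all three should be in the same ballpark
@benchmark h(1.0)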

On Monday, August 29, 2016, Deniz Yuret <[email protected]> wrote:

> Hi Mosè,
>
> Thanks for the wonderful feedback!
>
> AutoGrad has some overhead for recording and differentiating primitive
> operators.  However, I think there is room for improvement: the current
> design carries over elements of the original Python package (anonymous
> functions, closures, etc.) that are probably not very efficient.  I am
> working on a more efficient design.
>
> Thanks for the derivative info as well!  And here I was thinking: who is
> going to care about the missing derivative of the Airy function :)  I'd
> better get to work...
>
> best,
> deniz
>
>
> On Mon, Aug 29, 2016 at 2:13 AM Mosè Giordano <[email protected]> wrote:
>
>> Hi Deniz,
>>
>> Announcing AutoGrad.jl <https://github.com/denizyuret/AutoGrad.jl>: an
>>> automatic differentiation package for Julia. It is a Julia port of the
>>> popular Python autograd <https://github.com/HIPS/autograd> package. It
>>> can differentiate regular Julia code that includes loops, conditionals,
>>> helper functions, closures etc. by keeping track of the primitive
>>> operations and using this execution trace to compute gradients. It uses
>>> reverse mode differentiation (a.k.a. backpropagation) so it can efficiently
>>> handle functions with array inputs and scalar outputs. It can compute
>>> gradients of gradients to handle higher order derivatives.
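>>>
>>> For instance (a minimal sketch of the grad API; the exact numbers you
>>> get will of course depend on the inputs):
>>>
>>> using AutoGrad
>>> f(x) = sin(x^2)
>>> df = grad(f)      # df(x) computes f'(x) = 2x*cos(x^2)
>>> df(2.0)
>>> ddf = grad(df)    # gradient of a gradient: the second derivative
>>> ddf(2.0)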
>>>
>>> Large parts of the code are directly ported from the Python autograd
>>> <https://github.com/HIPS/autograd> package. I'd like to thank autograd
>>> author Dougal Maclaurin for his support. See Baydin et al. (2015)
>>> <https://arxiv.org/abs/1502.05767> for a general review of automatic
>>> differentiation, the autograd tutorial
>>> <https://github.com/HIPS/autograd/blob/master/docs/tutorial.md> for
>>> some Python examples, and Dougal's PhD thesis for design principles.
>>> JuliaDiff <http://www.juliadiff.org/> has alternative differentiation
>>> tools for Julia.
>>>
>>> best,
>>> deniz
>>>
>>>
>> Very nice package!  It's one of the most complete yet easy-to-use
>> automatic differentiation packages I've checked out.  Some comments below.
>>
>> I expected (or at least hoped) that the performance of the function
>> returned by "grad" would be comparable to that of the actual
>> derivative/gradient.  However:
>>
>> julia> using BenchmarkTools, AutoGrad
>>
>> julia> COS = grad(sin)
>> gradfun (generic function with 1 method)
>>
>> julia> @benchmark cos(0.0)
>> BenchmarkTools.Trial:
>>   samples:          10000
>>   evals/sample:     1000
>>   time tolerance:   5.00%
>>   memory tolerance: 1.00%
>>   memory estimate:  0.00 bytes
>>   allocs estimate:  0
>>   minimum time:     4.00 ns (0.00% GC)
>>   median time:      4.00 ns (0.00% GC)
>>   mean time:        4.05 ns (0.00% GC)
>>   maximum time:     15.00 ns (0.00% GC)
>>
>> julia> @benchmark COS(0.0)
>> BenchmarkTools.Trial:
>>   samples:          10000
>>   evals/sample:     1
>>   time tolerance:   5.00%
>>   memory tolerance: 1.00%
>>   memory estimate:  3.61 kb
>>   allocs estimate:  78
>>   minimum time:     9.14 μs (0.00% GC)
>>   median time:      10.30 μs (0.00% GC)
>>   mean time:        10.98 μs (3.46% GC)
>>   maximum time:     3.88 ms (98.04% GC)
>>
>>
>> I found similar results for other simple functions.  Is this to be
>> expected, or is there room for improvement?
>>
>> I saw that the derivative dictionaries for most special functions are
>> incomplete.  You can take a look at the derivatives in file src/math.jl
>> <https://github.com/giordano/Measurements.jl/blob/master/src/math.jl#L148>
>> of my Measurements.jl <https://github.com/giordano/Measurements.jl>
>> package (I could use an automatic differentiation library in my package,
>> but I've never found one that suited my needs).  Look at the "result"
>> calls returned by the overloaded functions: the second argument is the
>> derivative or tuple of derivatives.
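>>
>> Schematically, the pattern is (a sketch of the idea, not the verbatim
>> source; see the file linked above for the real definitions):
>>
>> # value first, derivative(s) second, argument(s) third
>> cos(a::Measurement) = result(cos(a.val), -sin(a.val), a)
>> atan2(a::Measurement, b::Measurement) =
>>     result(atan2(a.val, b.val),
>>            (b.val/(a.val^2 + b.val^2), -a.val/(a.val^2 + b.val^2)),
>>            (a, b))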
>>
>> I noticed that you marked the derivative of airyprime as wrong.  I guess
>> you've been deceived by the misleading docstring of airy:
>> https://github.com/JuliaLang/julia/issues/17032  The second derivatives
>> of the Airy functions can be computed by exploiting the Airy equation
>> <https://en.wikipedia.org/wiki/Airy_function>.
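>>
>> Concretely, Ai satisfies y'' = x*y, so the derivative of airyprime is
>> just x*airy(0, x) (a sketch against Julia's airy(k, x), where k = 0 is
>> Ai and k = 1 is Ai'; the helper name here is made up):
>>
>> # d/dx Ai'(x) = Ai''(x) = x * Ai(x), by the Airy equation y'' = x*y
>> dairyprime(x) = x * airy(0, x)
>> # and likewise d/dx Bi'(x) = x * airybi(x)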
>>
>> Cheers,
>> Mosè
>>
>
