You're welcome :)
On 26/08/2016 at 10:49, Deniz Yuret wrote:
Sorry about that.  Here is the right link:

https://github.com/denizyuret/AutoGrad.jl


On Fri, Aug 26, 2016 at 11:38 AM Henri Girard <henri.gir...@gmail.com> wrote:

    Your link autograd.jl seems dead?


    On Friday, August 26, 2016 at 08:51:30 UTC+2, Deniz Yuret wrote:

        Announcing AutoGrad.jl
        <https://github.com/denizyuret/AutoGrad.jl>: an automatic
        differentiation package for Julia. It is a Julia port of the
        popular Python autograd
        <https://github.com/HIPS/autograd> package. It can
        differentiate regular Julia code that includes loops,
        conditionals, helper functions, closures etc. by keeping track
        of the primitive operations and using this execution trace to
        compute gradients. It uses reverse mode differentiation
        (a.k.a. backpropagation) so it can efficiently handle
        functions with array inputs and scalar outputs. It can compute
        gradients of gradients to handle higher order derivatives.
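
        A minimal usage sketch, assuming the package exports a grad
        function analogous to the Python autograd it ports (the install
        line below is illustrative, not part of the announcement):

            using AutoGrad      # e.g. Pkg.clone("https://github.com/denizyuret/AutoGrad.jl")

            # An ordinary Julia function; loops, conditionals, closures are fine.
            f(x) = sin(x) * x^2

            g = grad(f)         # g(x) returns df/dx at x via reverse mode
            h = grad(g)         # gradient of the gradient: second derivative

            g(1.0)              # ≈ 2*1.0*sin(1.0) + 1.0^2*cos(1.0)
            h(1.0)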

        Large parts of the code are directly ported from the Python
        autograd <https://github.com/HIPS/autograd> package. I'd like
        to thank autograd author Dougal Maclaurin for his support. See
        (Baydin et al. 2015) <https://arxiv.org/abs/1502.05767> for a
        general review of automatic differentiation, the autograd tutorial
        <https://github.com/HIPS/autograd/blob/master/docs/tutorial.md> for
        some Python examples, and Dougal's PhD thesis for design
        principles. JuliaDiff <http://www.juliadiff.org/> has
        alternative differentiation tools for Julia.

        best,
        deniz

