Forward-mode AD works faster for functions R^m -> R^n where m << n, e.g. 
when you have a function of a single input that returns a tuple or vector 
of multiple outputs.
Reverse-mode AD works faster for functions R^m -> R^n where m >> n, e.g. 
when you have a function of a vector of inputs that returns a single output 
(say, a typical loss function in machine learning). 
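
To make the two regimes concrete, here is a minimal sketch using ForwardDiff 
(the functions `f_wide` and `loss` are made up for illustration):

    using ForwardDiff

    # R -> R^3: one input, several outputs -- the forward-mode friendly shape.
    f_wide(x) = [sin(x), cos(x), x^2]

    # A single forward pass with dual numbers yields all three derivatives.
    ForwardDiff.derivative(f_wide, 1.0)       # 3-element vector of derivatives

    # R^3 -> R: many inputs, one output (a typical loss) -- the reverse-mode
    # friendly shape.
    loss(w) = sum(abs2, w .- 1.0)

    # ForwardDiff still works here, but its cost grows with length(w);
    # a reverse-mode tool needs a single backward pass regardless of length(w).
    ForwardDiff.gradient(loss, [0.5, 2.0, 3.0])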

As for packages, ReverseDiffSource generates expressions for derivatives, 
but currently supports only simple `for` loops, and the author is working on 
adding support for `if`s 
<https://github.com/JuliaDiff/ReverseDiffSource.jl/issues/39>. AFAIK, 
ForwardDiff doesn't have any restrictions on program structure and should 
work with any kind of loops, conditions, recursion, etc., but can't be used 
for code generation (e.g. for GPU). 
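
To give a rough idea of the symbolic output, here is a sketch of 
ReverseDiffSource usage from memory of its README; the exact `rdiff` 
signature and the shape of the returned expression may differ slightly:

    using ReverseDiffSource

    # Differentiate a scalar expression w.r.t. x; the keyword argument gives
    # rdiff an example value (or type, depending on the version) for x.
    dex = rdiff(:(x^3 + 2x), x=2.0)

    # dex is an Expr that evaluates to a tuple of (value, derivative), so it
    # can be spliced into generated code or inspected/transformed further.
    @eval deriv_at(x) = $dex
    deriv_at(2.0)    # expected to return something like (12.0, 14.0)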

I'm also currently working on Espresso.jl - a package for expression 
manipulation and hybrid differentiation 
<https://github.com/dfdx/Espresso.jl#hybrid-differentiation> (AD pipeline, 
symbolic output) where I drop support for any kind of conditions and loops 
in favor of higher flexibility (e.g. differentiation of nested function 
calls, which is hard to achieve 
<https://github.com/JuliaDiff/ReverseDiffSource.jl/issues/38> in 
ReverseDiffSource). The package is still at an early testing stage, but 
differentiation over expressions with numbers and simple vectors seems to 
work fine, and I'm working hard to add derivatives of higher-order tensors. 


On Sunday, August 21, 2016 at 6:36:48 AM UTC+3, Benjamin Deonovic wrote:
>
> What are pros/cons of the various auto differentiation packages available 
> in Julia? Is there a benefit to using ReverseDiffSource vs ForwardDiff? 
> Thank you
>
