Thanks for all your answers! Just to be clear, at the moment I'm not 
interested in TensorFlow itself, but specifically in its automatic 
differentiation capabilities. 

ReverseDiffSource.jl looks very promising and is indeed quite fast for `R^n 
-> R` functions in the few experiments I've run; a rough sketch of what I 
tried is below. Thanks again!
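
In case it's useful to others, here is roughly what my experiment looked 
like. The `rdiff` call is written from memory of the ReverseDiffSource.jl 
README (function input, default derivative order of 1), so please 
double-check the exact signature before relying on it:

using ReverseDiffSource

# scalar-valued test function, f: R^n -> R
rosenbrock(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2

# rdiff takes the function plus the types of its arguments and returns a
# new function that evaluates the value together with the gradient
rosen_with_grad = rdiff(rosenbrock, (Vector{Float64},))

val, grad = rosen_with_grad([0.5, 2.0])   # Float64 value and its gradient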

On Saturday, July 9, 2016 at 3:02:55 AM UTC+3, Andrei Zh wrote:
>
> In Python, libraries like TensorFlow or Theano make it possible to perform 
> automatic differentiation over their computational graphs. E.g. in 
> TensorFlow (example from SO 
> <http://stackoverflow.com/questions/35226428/how-do-i-get-the-gradient-of-the-loss-at-a-tensorflow-variable>): 
>
> data = tf.placeholder(tf.float32)
> var = tf.Variable(...)              # model parameters
> loss = some_function_of(var, data)
>
> var_grad = tf.gradients(loss, [var])[0]   # d(loss)/d(var), as a graph node
>
> What is the closest thing in Julia at the moment? 
>
> Here's what I've checked so far: 
>
>  * *ForwardDiff.jl* <https://github.com/JuliaDiff/ForwardDiff.jl> - it 
> computes derivatives using forward-mode automatic differentiation (AD). 
> Although forward-mode AD has its advantages, I found this package quite 
> slow: e.g. for a vector of 1000 elements, the gradient takes ~100x longer 
> than evaluating the function itself (a sketch of the call I'm timing 
> follows this list). Another potential issue is that ForwardDiff.jl doesn't 
> output a symbolic version of the gradient, so it's hardly usable for 
> computation on a GPU, for example. 
>  * *Calculus.jl* <https://github.com/johnmyleswhite/Calculus.jl> - among 
> other things, this package provides symbolic differentiation. However, it 
> seems to treat all symbols as scalar numbers and doesn't support matrices 
> or vectors (see the second sketch below the list). 
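>
> For reference, the kind of call I'm timing looks roughly like this (the 
> test function `f` here is just a made-up stand-in, not what I actually 
> benchmarked):
>
> using ForwardDiff
>
> # hypothetical scalar-valued function on a 1000-element vector
> f(x) = sum(x .^ 2)
>
> x = rand(1000)
>
> @time f(x)                          # plain evaluation
> @time ForwardDiff.gradient(f, x)    # forward-mode AD gradient, much slower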
>
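> And the symbolic side of Calculus.jl, which works on expressions of scalar 
> symbols only (the expression is again just an illustration):
>
> using Calculus
>
> # symbolic differentiation of a scalar expression
> Calculus.differentiate("x^2 + sin(x)", :x)   # => roughly :(2x + cos(x))
>
> # but there is no way to declare x as a vector or a matrix here
>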
> I have only a shallow knowledge of both of these packages, so please 
> correct me if any of my conclusions are wrong. And if they aren't, is there 
> any other package or project that I should consider? 
>
