Forward-mode differentiation has poor complexity for functions of the form 
R^n -> R: the cost of the gradient grows with the input dimension n. Try 
using ReverseDiffSource.jl instead.
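
A minimal sketch of what that might look like, assuming ReverseDiffSource's 
rdiff entry point (the exact call signature and the set of supported 
operations are described in the package README; the expression below is 
only an illustration, not a verified snippet):

using ReverseDiffSource

# Reverse-mode differentiation of an expression. The example value passed
# for x tells rdiff what type/size to assume; the result is an expression
# that evaluates to (value, gradient), and the gradient cost does not grow
# with the length of x the way forward mode does.
ex  = :( sum( (x .- 1.0).^2 ) )     # an R^n -> R expression in x
dex = rdiff(ex, x=zeros(2))         # derived expression (assumed call form)

@eval loss_and_grad(x) = $dex       # splice it into a callable function
loss_and_grad([0.0, 2.0])           # -> (2.0, [-2.0, 2.0])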

This blog post describes positive results using ReverseDiffSource.jl on an 
autoencoder:

http://int8.io/automatic-differentiation-machine-learning-julia/#Training_autoencoder_8211_results

Since back-propagation is reverse-mode differentiation, this should in 
theory be equivalent to TensorFlow's automatic differentiation.
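
For reference, the forward-mode measurement mentioned in the quoted message 
below boils down to a call like this (a rough sketch; the test function is 
just a placeholder, and the actual ratio depends on the function and on 
ForwardDiff's chunking):

using ForwardDiff

f(x) = sum(sin, x)                  # placeholder R^n -> R test function
x = rand(1000)

f(x); ForwardDiff.gradient(f, x)    # warm up so timings exclude compilation

@time f(x)                          # one function evaluation
@time ForwardDiff.gradient(f, x)    # forward-mode gradient; cost grows with length(x)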

On Friday, July 8, 2016 at 5:02:55 PM UTC-7, Andrei Zh wrote:
>
> In Python, libraries like TensorFlow or Theano provide the ability to 
> perform automatic differentiation over their computational graphs. E.g. in 
> TensorFlow (example from SO 
> <http://stackoverflow.com/questions/35226428/how-do-i-get-the-gradient-of-the-loss-at-a-tensorflow-variable>
> ): 
>
> data = tf.placeholder(tf.float32)
> var = tf.Variable(...)              
> loss = some_function_of(var, data)
>
> var_grad = tf.gradients(loss, [var])[0]
>
> What is the closest thing in Julia at the moment? 
>
> Here's what I've checked so far: 
>
>  * ForwardDiff.jl <https://github.com/JuliaDiff/ForwardDiff.jl> - it 
> computes derivatives using forward-mode automatic differentiation (AD). 
> Although AD has particular advantages, I found this package quite slow. 
> E.g. for a vector of 1000 elements the gradient takes ~100x longer than 
> the function itself. Another potential issue is that ForwardDiff.jl 
> doesn't output a symbolic version of the gradient and thus is hardly 
> usable for computation on GPUs, for example. 
>  * *Calculus.jl* <https://github.com/johnmyleswhite/Calculus.jl> - among 
> other things, this package provides symbolic differentiation. However, it 
> seems to treat all symbols as scalar numbers and doesn't support matrices 
> or vectors. 
>
> I have pretty shallow knowledge of both these packages, so please correct 
> me if I'm wrong somewhere in my conclusions. And if not, is there any other 
> package or project that I should consider? 
>
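
For the record, Calculus.jl's symbolic differentiation mentioned in the 
second bullet above looks roughly like this, and indeed treats every symbol 
as a scalar:

using Calculus

# Symbolic differentiation of a scalar expression with respect to :x;
# the result is a Julia expression.
differentiate("x^2 + sin(x)", :x)   # -> an expression equivalent to 2x + cos(x)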
