[julia-users] Re: Symbolic differentiation similar to TensorFlow / Theano

2016-07-09 Thread Andrei Zh
Thanks for all your answers! Just to make it clear, at the moment I'm not 
really interested in TensorFlow itself, but specifically in its automatic 
differentiation capabilities. 

ReverseDiffSource.jl looks very promising and is indeed quite fast for `R^n 
-> R` in a few experiments I've run. Thanks again!
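
A minimal sketch of the kind of experiment meant here, not taken from the 
thread: ReverseDiffSource.jl generates a gradient expression for a 
scalar-valued expression in one reverse sweep. The `rdiff(expr; x = ...)` 
keyword form (a sample value that fixes the variable's type and size) follows 
the package README of that era and may need adjusting; only a limited set of 
functions is supported.

using ReverseDiffSource

ex  = :( sum(x .* x) )            # scalar-valued expression in the vector x
dex = rdiff(ex, x = zeros(3))     # expression evaluating to (value, gradient)

@eval value_and_grad(x) = $dex    # compile the generated expression
val, grad = value_and_grad([1.0, 2.0, 3.0])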

On Saturday, July 9, 2016 at 3:02:55 AM UTC+3, Andrei Zh wrote:
>
> In Python, libraries like TensorFlow or Theano provide the possibility to 
> perform automatic differentiation over their computational graphs. E.g. in 
> TensorFlow (example from SO):
>
> import tensorflow as tf
>
> data = tf.placeholder(tf.float32)
> var = tf.Variable(...)  
> loss = some_function_of(var, data)
>
> var_grad = tf.gradients(loss, [var])[0]
>
> What is the closest thing in Julia at the moment? 
>
> Here's what I've checked so far: 
>
>  * ForwardDiff.jl - it computes derivatives using forward-mode automatic 
> differentiation (AD). Although AD has its advantages, I found this package 
> quite slow: e.g. for a vector of 1000 elements, the gradient takes ~100x 
> longer than the function itself. Another potential issue is that 
> ForwardDiff.jl doesn't output a symbolic version of the gradient and is 
> thus hardly usable for computation on a GPU, for example.
>  * *Calculus.jl* - among other things, this package provides symbolic 
> differentiation. However, it seems to treat all symbols as scalar numbers 
> and doesn't support matrices or vectors.
>
> I have pretty shallow knowledge of both these packages, so please correct 
> me if I'm wrong somewhere in my conclusions. And if not, is there any other 
> package or project that I should consider? 
>
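
For concreteness, here is a minimal sketch (not taken from the thread) of how 
the two packages in the quoted question are typically called. The function, 
the input size, and the expression string are made up for illustration.

using ForwardDiff, Calculus

# Forward-mode AD on an R^n -> R function: exact gradients, but the work
# grows with the input dimension n, which matches the slowdown noted above.
f(x) = sum(x .* x) + sin(x[1])
x0 = randn(1000)
g = ForwardDiff.gradient(f, x0)      # numeric gradient of f at x0

# Calculus.jl symbolic differentiation: operates on scalar expressions only,
# which is the matrix/vector limitation mentioned above.
dexpr = Calculus.differentiate("sin(x) + x^2", :x)   # returns a Julia expression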


[julia-users] Re: Symbolic differentiation similar to TensorFlow / Theano

2016-07-09 Thread Gabriel Goh
 

There is also ReverseDiffSparse.jl:

https://github.com/mlubin/ReverseDiffSparse.jl


I've never used it myself, but I thought I'd throw it out there.

On Saturday, July 9, 2016 at 12:27:33 AM UTC-7, Gabriel Goh wrote:
>
> Forward-mode differentiation scales poorly for functions of the form 
> R^n -> R; try using ReverseDiffSource.jl instead.
>
> This blog post describes positive results using ReverseDiffSource.jl on 
> an autoencoder:
>
> http://int8.io/automatic-differentiation-machine-learning-julia/#Training_autoencoder_8211_results
>
> Since back-propagation is reverse-mode differentiation, this should in 
> theory be equivalent to TensorFlow's automatic differentiation.
>
> On Friday, July 8, 2016 at 5:02:55 PM UTC-7, Andrei Zh wrote:
>>
>> [original question trimmed; quoted in full above]
>

[julia-users] Re: Symbolic differentiation similar to TensorFlow / Theano

2016-07-09 Thread Gabriel Goh
Forward-mode differentiation scales poorly for functions of the form R^n 
-> R; try using ReverseDiffSource.jl instead.

This blog post describes positive results using ReverseDiffSource.jl on an 
autoencoder:

http://int8.io/automatic-differentiation-machine-learning-julia/#Training_autoencoder_8211_results

Since back-propagation is reverse-mode differentiation, this should in theory 
be equivalent to TensorFlow's automatic differentiation.
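
To illustrate that equivalence (not from the thread): a tiny hand-rolled 
reverse sweep for f(W, x) = sum(tanh(W*x)). Reverse-mode tools build this 
bookkeeping automatically from the expression graph, but the idea is the same 
as back-propagation: one forward pass plus one backward pass yields the full 
gradient, however many inputs there are.

function f_and_grads(W, x)
    # forward pass, keeping intermediates
    z = W * x                      # pre-activations
    h = map(tanh, z)               # activations
    y = sum(h)                     # scalar output

    # backward pass: propagate adjoints dy/d(node) from the output back
    dy_dh = ones(length(h))            # d(sum)/dh is 1 for every element
    dy_dz = dy_dh .* (1 .- h .^ 2)     # tanh'(z) = 1 - tanh(z)^2
    dy_dW = dy_dz * x'                 # outer product: d(W*x)/dW
    dy_dx = W' * dy_dz                 # d(W*x)/dx

    return y, dy_dW, dy_dx
end

W = randn(4, 3); x = randn(3)
y, gW, gx = f_and_grads(W, x)      # gradients w.r.t. both W and x in one sweep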

On Friday, July 8, 2016 at 5:02:55 PM UTC-7, Andrei Zh wrote:
>
> [original question trimmed; quoted in full above]
>


[julia-users] Re: Symbolic differentiation similar to TensorFlow / Theano

2016-07-08 Thread Chris Rackauckas
Have you checked out the wrappers for TensorFlow, 
https://github.com/benmoran/TensorFlow.jl? Or directly using PyCall?
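
For context, a minimal sketch (not from the thread) of driving the quoted SO 
example from Julia through PyCall. It assumes a graph-mode (TF 1.x-era) 
TensorFlow in the Python environment PyCall points at; `tf[:name]` is the 
attribute-access syntax of PyCall releases from that time, and the loss is a 
made-up stand-in.

using PyCall
@pyimport tensorflow as tf

data = tf[:placeholder](tf[:float32])
var  = tf[:Variable](1.0)
loss = tf[:square](tf[:add](var, data))    # stand-in for "some function of var and data"

var_grad = tf[:gradients](loss, [var])[1]  # tf.gradients returns a list; Julia indexing is 1-based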

On Friday, July 8, 2016 at 5:02:55 PM UTC-7, Andrei Zh wrote:
>
> [original question trimmed; quoted in full above]
>