Thanks for all your answers! Just to make it clear, at the moment I'm not
really interested in TensorFlow itself, but specifically in its automatic
differentiation capabilities.
ReverseDiffSource.jl looks very promising and is indeed quite fast for `R^n
-> R` functions in a few experiments I've run.
There is also
https://github.com/mlubin/ReverseDiffSparse.jl
I've never used it myself, but I thought I'd throw it out there.
On Saturday, July 9, 2016 at 12:27:33 AM UTC-7, Gabriel Goh wrote:
>
> Forward differentiation has poor complexity for functions of the form R^n
> -> R. Try using
Forward differentiation has poor complexity for functions of the form R^n
-> R: it needs one pass per input to build a full gradient. Try using
ReverseDiffSource.jl instead.
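To make the complexity point concrete, here is a minimal toy sketch of reverse-mode AD (in Python, not the API of ReverseDiffSource.jl or any real library): one forward pass builds the computation graph, and a single backward sweep yields all n partial derivatives of an `R^n -> R` function, whereas forward mode would need n separate passes.

```python
# Toy reverse-mode AD: all names here (Var, backward) are illustrative,
# not from any real library.

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # (parent_var, local_partial) pairs
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def backward(output):
    """One backward sweep accumulates d(output)/d(x) into every leaf.
    Toy traversal: fine for this expression, where only leaves are shared;
    a real implementation walks nodes in reverse topological order."""
    output.grad = 1.0
    stack = [output]
    while stack:
        v = stack.pop()
        for parent, local in v.parents:
            parent.grad += local * v.grad
            stack.append(parent)

# f(x) = x1*x2 + x2*x3: one forward pass, one backward pass, full gradient.
xs = [Var(2.0), Var(3.0), Var(4.0)]
f = xs[0] * xs[1] + xs[1] * xs[2]
backward(f)
print([x.grad for x in xs])   # [3.0, 6.0, 3.0]
```

With forward mode you would instead seed one input at a time and rerun the function n times, which is why it wins for `R -> R^n` but loses for `R^n -> R`.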
This blog post describes positive results using ReverseDiffSource.jl on an
autoencoder.
Have you checked out the wrapper for
TensorFlow, https://github.com/benmoran/TensorFlow.jl, or calling it
directly via PyCall?
On Friday, July 8, 2016 at 5:02:55 PM UTC-7, Andrei Zh wrote:
>
> In Python, libraries like TensorFlow or Theano provide possibility to
> perform automatic