It has been a year and a half since I wrote the first version of this post 
<https://groups.google.com/d/topic/julia-users/rSgtUpcYPbE/discussion> and 
it is time for an update, just in time for the Julia 0.5.0 release!

Knet <https://github.com/denizyuret/Knet.jl> (pronounced “kay-net”) is the 
Koç University deep learning framework implemented in Julia. The latest 
version (0.8.0) is finally registered as an official Julia package. Unlike 
gradient-generating compilers such as Theano and TensorFlow, which force users 
into a restricted mini-language, Knet allows machine learning models to be 
defined and trained using the full power and expressivity of Julia. Models are 
defined by describing only the forward calculation in plain Julia, using 
helper functions, loops, conditionals, recursion, closures, tuples and 
dictionaries, array indexing and concatenation, and almost everything else 
Julia offers. High performance is achieved by combining automatic 
differentiation that covers most of Julia with efficient GPU kernels and 
memory management. Computations can be performed on the GPU simply by using 
KnetArray instead of Array for parameters and data. To find out more and see 
some examples, check out the README 
<https://github.com/denizyuret/Knet.jl/blob/master/README.rst>.
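
To make this concrete, here is a minimal sketch in the spirit of the 
linear-regression example in the README, assuming the grad and KnetArray 
functions exported by Knet 0.8; the toy data at the end is made up for 
illustration, so treat this as a sketch rather than a verbatim copy of the 
README code:

    using Knet

    # Forward calculation: a linear model written in plain Julia.
    predict(w, x) = w[1] * x .+ w[2]

    # Squared-error loss, again ordinary Julia code.
    loss(w, x, y) = sum(abs2, y - predict(w, x)) / size(y, 2)

    # grad returns a function that computes the gradient of the loss
    # with respect to its first argument (the parameter list w).
    lossgradient = grad(loss)

    # Plain SGD training loop over (x, y) minibatches.
    function train(w, data; lr=0.1)
        for (x, y) in data
            dw = lossgradient(w, x, y)
            for i in 1:length(w)
                w[i] -= lr * dw[i]
            end
        end
        return w
    end

    # Hypothetical toy problem: 13 input features, 100 examples.
    w = Any[0.1 * randn(1, 13), zeros(1, 1)]
    x = randn(13, 100)
    y = randn(1, 100)
    w = train(w, [(x, y)])

The same model definition should run on the GPU once the parameter and data 
arrays are wrapped in KnetArray (e.g. w[1] = KnetArray(w[1])), with no other 
change to the code.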

best,
deniz
