It should work with any model that uses gradient-based optimization.  Let
me know if you find otherwise.
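
For example, a minimal linear-regression sketch in this style (illustrative
names, not taken from the Knet examples; the forward pass is ordinary Julia,
and Knet's grad would supply the gradient function):

```julia
# Forward calculation written as plain Julia code: a linear model.
predict(w, x) = w[1] * x .+ w[2]

# Quadratic loss on top of the forward pass, still plain Julia.
loss(w, x, y) = sum(abs2, predict(w, x) .- y) / size(y, 2)

# With Knet, the gradient function would come from automatic
# differentiation, e.g.
#   using Knet
#   lossgradient = grad(loss)
# and the same code runs on the GPU if w and x are KnetArrays.

w = Any[fill(2.0, 1, 1), 1.0]   # weight matrix and bias
x = fill(3.0, 1, 1)
println(predict(w, x)[1])        # 2*3 + 1 = 7.0
```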

On Tue, Sep 20, 2016 at 11:19 PM dnm <rtemp...@gmail.com> wrote:

> Very cool! Does it work with Distributions.jl so we can write
> probabilistic models?
>
>
> On Tuesday, September 20, 2016 at 2:31:31 PM UTC-4, Deniz Yuret wrote:
>>
>> It has been a year and a half since I wrote the first version of this
>> post
>> <https://groups.google.com/d/topic/julia-users/rSgtUpcYPbE/discussion>
>> and it is time for an update, just in time for the Julia 0.5.0 release!
>>
>> Knet <https://github.com/denizyuret/Knet.jl> (pronounced “kay-net”) is
>> the Koç University deep learning framework implemented in Julia. The latest
>> version (0.8.0) is finally registered as an official Julia package.  Unlike
>> gradient-generating compilers such as Theano and TensorFlow, which force
>> users into a restricted mini-language, Knet allows models to be defined and
>> trained using the full power and expressivity of Julia. Models are defined
>> by describing only the forward calculation in plain Julia, using helper
>> functions, loops, conditionals, recursion, closures, tuples, dictionaries,
>> array indexing, concatenation, and almost everything else Julia offers.
>> High performance is achieved by combining
>> automatic differentiation of most of Julia with efficient GPU kernels and
>> memory management. The computations can be performed on the GPU by simply
>> using KnetArray instead of Array for parameters and data. To find out more
>> and see some examples, check out the README
>> <https://github.com/denizyuret/Knet.jl/blob/master/README.rst>.
>>
>> best,
>> deniz
>>
>>
