Some issues with the gradient_descent method have now been fixed; thanks to Sam Lendel (https://github.com/lendle) for pointing them out.

On Wednesday, July 23, 2014 8:15:56 PM UTC+12, Alireza Nejati wrote:
>
> For about two weeks now, Zac Cranko, Pasquale Minervini, and I (Alireza 
> Nejati a.k.a. anj1) have been working on a new package for neural networks 
> in julia: NeuralNets.jl <https://github.com/anj1/NeuralNets.jl>.
>
> The goal is to create a clean, modular implementation of neural networks 
> that can easily be extended, while keeping it fast. This would not be 
> possible in a lot of other languages but it's been pretty straightforward 
> in julia so far. Currently we support a whole bunch of training methods 
> including Levenberg-Marquardt, gradient descent with momentum, and Adagrad.
>
> We have not yet tagged a numbered release, so a lot of things are still 
> in a preliminary state. In particular, the documentation is incomplete in 
> places (but you can find working examples in the examples directory). Any 
> and all feedback is welcome.

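To give a flavor of one of the training methods mentioned in the announcement, here is a minimal sketch of gradient descent with momentum in Julia. Note this is not the NeuralNets.jl API; the function name and keyword arguments below are hypothetical, and it is shown on a toy quadratic rather than a network loss.

```julia
# Illustrative sketch of gradient descent with momentum.
# NOT the NeuralNets.jl API; names and defaults here are hypothetical.
function momentum_descent(grad, w; lr=0.1, mu=0.9, iters=200)
    v = zero(w)                       # velocity (accumulated step direction)
    for _ in 1:iters
        v = mu .* v .- lr .* grad(w)  # blend previous velocity with new gradient
        w = w .+ v                    # take the accumulated step
    end
    return w
end

# Minimize f(w) = sum(w .^ 2), whose gradient is 2w; the minimum is at 0.
w = momentum_descent(w -> 2 .* w, [1.0, -2.0])
```

The velocity term lets successive gradients in a consistent direction reinforce each other, which typically speeds convergence over plain gradient descent on poorly scaled problems.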