I've seen some simple examples of neural networks on GitHub, but nothing
seems to be complete or reliable. If I had infinite time, I'd like to
experiment with ForwardDiff.jl: automatically differentiate network
expressions to generate backprop code for arbitrary MLP architectures
(something similar to what Theano does, though Theano does much more
than that). A rough sketch of the idea is below.
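
Roughly what I have in mind, as a minimal sketch: differentiate a tiny
one-hidden-layer MLP loss through a flat parameter vector with
ForwardDiff.gradient. The mlp_loss helper and the 2-3-1 layer sizes are
just illustrative, not from any package:

    using ForwardDiff

    # Loss of a 2-3-1 MLP, with all weights/biases packed into one
    # flat vector θ so ForwardDiff can differentiate w.r.t. all of them.
    function mlp_loss(θ, x, y)
        W1 = reshape(θ[1:6], 3, 2);   b1 = θ[7:9]
        W2 = reshape(θ[10:12], 1, 3); b2 = θ[13]
        h  = tanh.(W1 * x .+ b1)   # hidden layer
        ŷ  = (W2 * h)[1] + b2      # linear output
        (ŷ - y)^2                  # squared error
    end

    θ0   = randn(13)
    x, y = [0.5, -1.0], 1.0
    # Forward-mode AD: cost grows with length(θ0), so a reverse-mode
    # (true backprop) code generator would be needed for big networks.
    g = ForwardDiff.gradient(θ -> mlp_loss(θ, x, y), θ0)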

--
João Felipe Santos


On Sat, Jun 7, 2014 at 12:11 PM, John Myles White <[email protected]>
wrote:

> I don’t think there’s any reliable neural network Julia code published so
> far.
>
>  — John
>
> On Jun 7, 2014, at 1:18 AM, [email protected] wrote:
>
> >
> > I am just wondering if there is any code for a multi-layer perceptron
> > in Julia with parallel processing and iRprop+ for gradient descent?
>
>
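
For reference, the iRprop+ update asked about above (the Rprop variant
with weight backtracking, from Igel & Hüsken) looks roughly like the
following in Julia. This is a hedged sketch of the published algorithm,
not code from any existing package:

    # In-place iRprop+ step for one weight vector. g/gprev are the
    # current/previous gradients, Δ the per-weight step sizes, Δwprev
    # the last applied updates, E/Eprev the current/previous errors.
    function irprop_plus!(w, g, gprev, Δ, Δwprev, E, Eprev;
                          ηp=1.2, ηm=0.5, Δmin=1e-6, Δmax=50.0)
        for i in eachindex(w)
            s = g[i] * gprev[i]
            if s > 0                        # same sign: grow step
                Δ[i] = min(Δ[i] * ηp, Δmax)
                Δwprev[i] = -sign(g[i]) * Δ[i]
                w[i] += Δwprev[i]
            elseif s < 0                    # sign flip: shrink step
                Δ[i] = max(Δ[i] * ηm, Δmin)
                if E > Eprev                # the "+": revert only if
                    w[i] -= Δwprev[i]       # the error got worse
                end
                g[i] = 0.0                  # skip adaptation next time
            else                            # one gradient was zero
                Δwprev[i] = -sign(g[i]) * Δ[i]
                w[i] += Δwprev[i]
            end
            gprev[i] = g[i]
        end
    end

Only the sign of each gradient component is used, which is what makes
Rprop-family methods attractive for batch training of MLPs.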
