[ https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15278693#comment-15278693 ]
Mike Dusenberry commented on SYSTEMML-618:
------------------------------------------

Updates:
* The library is currently written entirely in the DML (R-like) syntax. I am about to start adding a PyDML version that will become the new main codebase; both the DML and PyDML versions will be maintained side by side in the same repo.
* There is early support for (vanilla) RNNs and LSTMs, and the plan is to expand on both.

> Deep Learning DML Library
> -------------------------
>
>                 Key: SYSTEMML-618
>                 URL: https://issues.apache.org/jira/browse/SYSTEMML-618
>             Project: SystemML
>          Issue Type: New Feature
>            Reporter: Mike Dusenberry
>            Assignee: Mike Dusenberry
>
> This issue tracks the creation of an experimental, layers-based library in pure PyDML & DML that contains layers with simple forward/backward APIs for affine, convolution (starting with 2D), max-pooling, non-linearities (ReLU, sigmoid, softmax, etc.), dropout, loss functions, other layers, optimizers, and gradient checks. (A minimal sketch of the forward/backward convention is included after this message.)
> *SystemML-NN*: [https://github.com/dusenberrymw/systemml-nn]
> _Current status:_
> * Layers:
> ** Core:
> *** Affine
> *** Spatial Convolution
> *** LSTM
> *** Max Pooling
> *** RNN
> ** Nonlinearities:
> *** ReLU
> *** Sigmoid
> *** Softmax
> *** Tanh
> ** Loss:
> *** Cross-entropy loss
> *** L1 loss
> *** L2 loss
> *** Log ("Logistic") loss
> ** Regularization:
> *** Dropout
> *** L1 reg
> *** L2 reg
> * Optimizers:
> ** Adagrad
> ** Adam
> ** RMSprop
> ** SGD
> ** SGD w/ Momentum
> ** SGD w/ Nesterov Momentum
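To illustrate the simple forward/backward layer API described above, here is a minimal sketch of an affine (fully-connected) layer in DML. The function names, argument order, and matrix shapes are assumptions made for illustration, not necessarily the library's actual signatures.

{code}
# Hypothetical affine layer sketch in DML (illustrative only).
# Assumed shapes: X is (N, D), W is (D, M), b is (1, M).

affine_forward = function(matrix[double] X, matrix[double] W, matrix[double] b)
    return (matrix[double] out) {
  # Linear transformation of the inputs: out is (N, M).
  out = X %*% W + b
}

affine_backward = function(matrix[double] dout, matrix[double] X,
                           matrix[double] W, matrix[double] b)
    return (matrix[double] dX, matrix[double] dW, matrix[double] db) {
  # dout is the (N, M) gradient of the loss w.r.t. the layer output.
  dX = dout %*% t(W)   # gradient w.r.t. the inputs, (N, D)
  dW = t(X) %*% dout   # gradient w.r.t. the weights, (D, M)
  db = colSums(dout)   # gradient w.r.t. the biases, (1, M)
}
{code}

Under this kind of convention, a training script chains the {{forward}} calls of each layer, passes the loss gradient back through the matching {{backward}} calls, and hands the resulting parameter gradients (e.g. {{dW}}, {{db}}) to one of the optimizers listed above; a finite-difference gradient check can then compare the analytic gradients against numerical estimates.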