[ https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15290212#comment-15290212 ]

Mike Dusenberry commented on SYSTEMML-618:
------------------------------------------

Update: The initial version, submitted in PR 160, has been merged into the main SystemML repo in [commit 781d24d|https://github.com/apache/incubator-systemml/commit/781d24d86dea1de880c6b66a75882ecfa5f1086c].
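
For anyone getting started with the library, here is a minimal, self-contained DML sketch of the simple forward/backward layer pattern described in the issue below. The function names, signatures, and the toy network are illustrative only, not the exact SystemML-NN API.

{code}
# Illustrative sketch of the layer API pattern: each layer exposes a
# forward(...) and backward(...) function.  Names and signatures here are
# hypothetical, not the actual SystemML-NN code.

affine_forward = function(matrix[double] X, matrix[double] W, matrix[double] b)
    return (matrix[double] out) {
  # Fully-connected layer: out = X %*% W + b (b broadcast across rows)
  out = X %*% W + b
}

affine_backward = function(matrix[double] dout, matrix[double] X,
                           matrix[double] W, matrix[double] b)
    return (matrix[double] dX, matrix[double] dW, matrix[double] db) {
  # Gradients of the loss w.r.t. the layer input and parameters
  dX = dout %*% t(W)
  dW = t(X) %*% dout
  db = colSums(dout)
}

relu_forward = function(matrix[double] X) return (matrix[double] out) {
  out = max(X, 0)
}

relu_backward = function(matrix[double] dout, matrix[double] X)
    return (matrix[double] dX) {
  dX = dout * (X > 0)
}

# One vanilla SGD step over a tiny affine -> ReLU network with a
# squared-error loss.
N = 4
D = 3
M = 2
lr = 0.01
X = rand(rows=N, cols=D)
y = rand(rows=N, cols=M)
W = rand(rows=D, cols=M) * 0.01
b = matrix(0, rows=1, cols=M)

h = affine_forward(X, W, b)
out = relu_forward(h)
loss = sum((out - y)^2) / (2 * N)
dout = (out - y) / N
dh = relu_backward(dout, h)
[dX, dW, db] = affine_backward(dh, X, W, b)
W = W - lr * dW
b = b - lr * db
print("loss: " + loss)
{code}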

> Deep Learning DML Library
> -------------------------
>
>                 Key: SYSTEMML-618
>                 URL: https://issues.apache.org/jira/browse/SYSTEMML-618
>             Project: SystemML
>          Issue Type: New Feature
>            Reporter: Mike Dusenberry
>            Assignee: Mike Dusenberry
>
> This issue tracks the creation of an experimental, layers-based library in 
> pure PyDML & DML that contains layers with simple forward/backward APIs for 
> affine, convolution (start with 2D), max-pooling, non-linearities (relu, 
> sigmoid, softmax, etc.), dropout, loss functions, other layers, optimizers, 
> and gradient checks.
> *SystemML-NN*: 
> [https://github.com/dusenberrymw/systemml-nn|https://github.com/dusenberrymw/systemml-nn]
> _Current status:_
> * Layers:
> ** Core:
> *** Affine
> *** Spatial Convolution
> *** LSTM
> *** Max Pooling
> *** RNN
> ** Nonlinearities:
> *** ReLU
> *** Sigmoid
> *** Softmax
> *** Tanh
> ** Loss:
> *** Cross-entropy loss
> *** L1 loss
> *** L2 loss
> *** Log ("Logistic") loss
> ** Regularization:
> *** Dropout
> *** L1 reg
> *** L2 reg
> * Optimizers:
> ** Adagrad
> ** Adam
> ** RMSprop
> ** SGD
> ** SGD w/ Momentum
> ** SGD w/ Nesterov Momentum
> * Tests:
> ** Gradient Checks
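
As a companion to the layer sketch above, here is a hedged DML sketch of the kind of centered finite-difference gradient check the test suite performs, comparing an analytic gradient against (f(w + eps) - f(w - eps)) / (2 * eps) for each parameter. The layer functions and names are again hypothetical, not the library's actual test harness.

{code}
# Illustrative numerical gradient check (hypothetical sketch).  Uses a
# simple loss L = sum(out) so that dL/dout is a matrix of ones.

affine_forward = function(matrix[double] X, matrix[double] W, matrix[double] b)
    return (matrix[double] out) {
  out = X %*% W + b
}

affine_backward = function(matrix[double] dout, matrix[double] X,
                           matrix[double] W, matrix[double] b)
    return (matrix[double] dX, matrix[double] dW, matrix[double] db) {
  dX = dout %*% t(W)
  dW = t(X) %*% dout
  db = colSums(dout)
}

N = 3
D = 4
M = 2
eps = 1e-5
X = rand(rows=N, cols=D)
W = rand(rows=D, cols=M)
b = rand(rows=1, cols=M)

# Analytic gradient w.r.t. W
out = affine_forward(X, W, b)
dout = matrix(1, rows=N, cols=M)
[dX, dW, db] = affine_backward(dout, X, W, b)

# Centered finite differences, one parameter at a time
for (i in 1:nrow(W)) {
  for (j in 1:ncol(W)) {
    old = as.scalar(W[i, j])
    W[i, j] = old + eps
    outp = affine_forward(X, W, b)
    W[i, j] = old - eps
    outm = affine_forward(X, W, b)
    W[i, j] = old
    dW_num = (sum(outp) - sum(outm)) / (2 * eps)
    dW_ana = as.scalar(dW[i, j])
    rel_err = abs(dW_num - dW_ana) / max(abs(dW_num), abs(dW_ana))
    print("dW[" + i + "," + j + "] relative error: " + rel_err)
  }
}
{code}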


