[
https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Arvind Surve updated SYSTEMML-618:
----------------------------------
Sprint: Sprint 1
> Deep Learning DML Library
> -------------------------
>
> Key: SYSTEMML-618
> URL: https://issues.apache.org/jira/browse/SYSTEMML-618
> Project: SystemML
> Issue Type: New Feature
> Reporter: Mike Dusenberry
> Assignee: Mike Dusenberry
>
> This issue tracks the creation of a layers-based *deep learning library* in
> pure DML.
> The library provides layers with simple {{forward}} (function evaluation) and
> {{backward}} (gradient computation) functions, including affine, convolution
> (initially 2D), max pooling, nonlinearities (ReLU, sigmoid, softmax, etc.),
> dropout, and loss functions, as well as optimizers and gradient checks.
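> As a minimal sketch of this layer API, an affine (fully-connected) layer
> could look roughly like the following (illustrative DML; function names and
> signatures are assumptions, not guaranteed to match the library exactly):
> {code}
> forward = function(matrix[double] X, matrix[double] W, matrix[double] b)
>     return (matrix[double] out) {
>   # Affine transformation: out = X W + b
>   out = X %*% W + b
> }
>
> backward = function(matrix[double] dout, matrix[double] X,
>                     matrix[double] W, matrix[double] b)
>     return (matrix[double] dX, matrix[double] dW, matrix[double] db) {
>   # Gradients of the loss w.r.t. the inputs, given upstream gradient dout
>   dX = dout %*% t(W)
>   dW = t(X) %*% dout
>   db = colSums(dout)
> }
> {code}
> Each layer pairs such a {{forward}} and {{backward}} function, so layers can
> be composed and their gradients chained by backpropagation.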
> *Examples*: Please see example *scripts* and *notebooks* in the {{examples}}
> folder:
> [https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN/examples].
> *SystemML-NN*:
> [https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
> * Layers:
> ** Core:
> *** Affine
> *** Batch Normalization 1D
> *** Batch Normalization 2D ("Spatial Batch Normalization")
> *** Convolution 2D ("Spatial Convolution")
> *** LSTM
> *** Max Pooling 2D ("Spatial Max Pooling")
> *** RNN
> ** Nonlinearities:
> *** ReLU
> *** Sigmoid
> *** Softmax
> *** Tanh
> ** Loss:
> *** Cross-entropy loss
> *** L1 loss
> *** L2 loss
> *** Log ("Logistic") loss
> ** Regularization:
> *** Dropout
> *** L1 reg
> *** L2 reg
> * Optimizers:
> ** Adagrad
> ** Adam
> ** RMSprop
> ** SGD
> ** SGD w/ Momentum
> ** SGD w/ Nesterov Momentum
> * Tests:
> ** Gradient Checks
> ** Unit Tests
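> The gradient checks above compare analytic gradients from {{backward}}
> against centered finite differences. A rough sketch of the idea in
> illustrative DML (the {{compute_loss}} helper is hypothetical, not the
> library's actual test harness):
> {code}
> # Centered numerical gradient of a scalar loss w.r.t. one entry X[i,j]:
> #   dX_num ≈ (loss(X[i,j] + h) - loss(X[i,j] - h)) / (2h)
> h = 1e-5
> old = as.scalar(X[i, j])
> X[i, j] = old + h
> loss_plus = compute_loss(X)    # hypothetical forward pass + loss
> X[i, j] = old - h
> loss_minus = compute_loss(X)
> X[i, j] = old                  # restore the original value
> dX_num = (loss_plus - loss_minus) / (2 * h)
> # Compare dX_num against the analytic dX[i,j] via relative error.
> {code}
> Repeating this over all entries of every parameter validates that each
> layer's {{backward}} function matches its {{forward}} function.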
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)