[ https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Mike Dusenberry updated SYSTEMML-618:
-------------------------------------
Description:
This issue tracks the creation of a layers-based *deep learning library* in pure DML.

The library contains layers with simple {{forward}} (function evaluation) and {{backward}} (gradient computation) functions for affine, convolution (starting with 2D), max-pooling, non-linearities (ReLU, sigmoid, softmax, etc.), dropout, loss functions, other layers, optimizers, and gradient checks.

*SystemML-NN*: [https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]

_Current status:_
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks
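To illustrate the layer API described above (each layer exposing a simple {{forward}} and {{backward}} function, validated by numerical gradient checks), here is a minimal sketch in Python/NumPy rather than DML; the function names and the centered-difference checking scheme are illustrative assumptions, not the library's actual code:

```python
# Illustrative sketch (NumPy, not DML) of the forward/backward layer pattern
# and a numerical gradient check, as used throughout a layers-based NN library.
import numpy as np

def affine_forward(X, W, b):
    """Affine (fully-connected) layer: out = X @ W + b."""
    return X @ W + b

def affine_backward(dout, X, W, b):
    """Gradients of the affine layer w.r.t. input and parameters."""
    dX = dout @ W.T
    dW = X.T @ dout
    db = dout.sum(axis=0)
    return dX, dW, db

def grad_check(f, x, dx_analytic, h=1e-5):
    """Centered-difference gradient check: max relative error of
    dx_analytic against (f(x+h) - f(x-h)) / (2h), elementwise."""
    max_rel_err = 0.0
    it = np.nditer(x, flags=["multi_index"])
    while not it.finished:
        i = it.multi_index
        orig = x[i]
        x[i] = orig + h; fp = f()
        x[i] = orig - h; fm = f()
        x[i] = orig  # restore
        num = (fp - fm) / (2 * h)
        denom = max(abs(num), abs(dx_analytic[i]), 1e-12)
        max_rel_err = max(max_rel_err, abs(num - dx_analytic[i]) / denom)
        it.iternext()
    return max_rel_err

# Check dW for a tiny affine layer under a scalar loss = sum(out * dout),
# whose upstream gradient is exactly `dout`.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))
W = rng.standard_normal((3, 2))
b = rng.standard_normal(2)
dout = rng.standard_normal((4, 2))  # upstream gradient
_, dW, _ = affine_backward(dout, X, W, b)
err = grad_check(lambda: np.sum(affine_forward(X, W, b) * dout), W, dW)
print("max relative error:", err)  # tiny for a correct gradient
```

The same two-function shape extends to every layer in the status list (convolution, pooling, nonlinearities, losses), which is what makes uniform gradient-check tests possible.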
> Deep Learning DML Library
> -------------------------
>
>                 Key: SYSTEMML-618
>                 URL: https://issues.apache.org/jira/browse/SYSTEMML-618
>             Project: SystemML
>          Issue Type: New Feature
>            Reporter: Mike Dusenberry
>            Assignee: Mike Dusenberry

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)