[jira] [Updated] (SYSTEMML-618) Deep Learning DML Library

2017-04-14 Thread Mike Dusenberry (JIRA)

 [ 
https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Dusenberry updated SYSTEMML-618:
-------------------------------------
Description: 
This issue tracks the creation of a layers-based *deep learning library* in 
pure DML.

The library contains layers with simple {{forward}} (function evaluation) and 
{{backward}} (gradient computation) functions for affine, convolution (start 
with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), dropout, 
loss functions, other layers, optimizers, and gradient checks.
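
The per-layer {{forward}}/{{backward}} contract can be illustrated with a minimal sketch of an affine (fully-connected) layer, written here in Python/NumPy rather than DML; the function names mirror the library's convention but are illustrative, not its actual API:

```python
import numpy as np

def affine_forward(X, W, b):
    # Forward: evaluate the function, out = X W + b
    return X @ W + b

def affine_backward(dout, X, W, b):
    # Backward: given the upstream gradient dout = dL/dout,
    # compute gradients w.r.t. the input and the parameters.
    dX = dout @ W.T          # dL/dX, passed to the previous layer
    dW = X.T @ dout          # dL/dW
    db = dout.sum(axis=0)    # dL/db
    return dX, dW, db
```

Chaining the {{backward}} functions in reverse layer order yields backpropagation, which is why each layer only needs these two simple functions.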

*Examples*: Please see example *scripts* and *notebooks* in the {{examples}} 
folder: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN/examples].

*SystemML-NN*: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
* Layers:
** Core:
*** Affine
*** Batch Normalization 1D
*** Batch Normalization 2D ("Spatial Batch Normalization")
*** Convolution 2D ("Spatial Convolution")
*** LSTM
*** Max Pooling 2D ("Spatial Max Pooling")
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks
** Unit Tests
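
The gradient checks listed under *Tests* compare each analytic {{backward}} implementation against a centered finite-difference estimate of the gradient. A self-contained Python sketch of the idea (helper names are hypothetical; the library's actual checks are written in DML):

```python
import numpy as np

def l2_loss(pred, y):
    # Example function to check: squared error averaged over the batch
    return float(np.sum((pred - y) ** 2) / pred.shape[0])

def l2_loss_grad(pred, y):
    # Analytic gradient of l2_loss with respect to pred
    return 2.0 * (pred - y) / pred.shape[0]

def grad_check(f, analytic_grad, x, h=1e-5):
    # Max relative error between analytic and centered-difference gradients
    g = analytic_grad(x)
    worst = 0.0
    it = np.nditer(x, flags=["multi_index"])
    for _ in it:
        i = it.multi_index
        orig = x[i]
        x[i] = orig + h
        f_plus = f(x)
        x[i] = orig - h
        f_minus = f(x)
        x[i] = orig  # restore the perturbed entry
        numeric = (f_plus - f_minus) / (2.0 * h)
        worst = max(worst, abs(numeric - g[i]) /
                    max(abs(numeric) + abs(g[i]), 1e-12))
    return worst
```

A relative error well below ~1e-6 indicates the analytic backward pass matches the numeric gradient; larger errors usually point to a sign or scaling bug.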

  was:
This issue tracks the creation of a layers-based *deep learning library* in 
pure DML.

The library contains layers with simple {{forward}} (function evaluation) and 
{{backward}} (gradient computation) functions for affine, convolution (start 
with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), dropout, 
loss functions, other layers, optimizers, and gradient checks.

*Examples*: Please see example *scripts* and *notebooks* in the {{examples}} 
folder: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN/examples].

*SystemML-NN*: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
* Layers:
** Core:
*** Affine
*** Batch Normalization
*** Spatial Convolution
*** LSTM
*** Spatial Max Pooling
*** RNN
*** Spatial Batch Normalization
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks
** Unit Tests


> Deep Learning DML Library
> -------------------------
>
> Key: SYSTEMML-618
> URL: https://issues.apache.org/jira/browse/SYSTEMML-618
> Project: SystemML
>  Issue Type: New Feature
>Reporter: Mike Dusenberry
>Assignee: Mike Dusenberry
>
> This issue tracks the creation of a layers-based *deep learning library* in 
> pure DML.
> The library contains layers with simple {{forward}} (function evaluation) and 
> {{backward}} (gradient computation) functions for affine, convolution (start 
> with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), 
> dropout, loss functions, other layers, optimizers, and gradient checks.
> *Examples*: Please see example *scripts* and *notebooks* in the {{examples}} 
> folder: 
> [https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN/examples].
> *SystemML-NN*: 
> [https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
> * Layers:
> ** Core:
> *** Affine
> *** Batch Normalization 1D
> *** Batch Normalization 2D ("Spatial Batch Normalization")
> *** Convolution 2D ("Spatial Convolution")
> *** LSTM
> *** Max Pooling 2D ("Spatial Max Pooling")
> *** RNN
> ** Nonlinearities:
> *** ReLU
> *** Sigmoid
> *** Softmax
> *** Tanh
> ** Loss:
> *** Cross-entropy loss
> *** L1 loss
> *** L2 loss
> *** Log ("Logistic") loss
> ** Regularization:
> *** Dropout
> *** L1 reg
> *** L2 reg
> * Optimizers:
> ** Adagrad
> ** Adam
> ** RMSprop
> ** SGD
> ** SGD w/ Momentum
> ** SGD w/ Nesterov Momentum
> * Tests:
> ** Gradient Checks
> ** Unit Tests



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (SYSTEMML-618) Deep Learning DML Library

2017-03-29 Thread Mike Dusenberry (JIRA)

 [ 
https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Dusenberry updated SYSTEMML-618:
-------------------------------------
Description: 
This issue tracks the creation of a layers-based *deep learning library* in 
pure DML.

The library contains layers with simple {{forward}} (function evaluation) and 
{{backward}} (gradient computation) functions for affine, convolution (start 
with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), dropout, 
loss functions, other layers, optimizers, and gradient checks.

*Examples*: Please see example *scripts* and *notebooks* in the {{examples}} 
folder: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN/examples].

*SystemML-NN*: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
* Layers:
** Core:
*** Affine
*** Batch Normalization
*** Spatial Convolution
*** LSTM
*** Spatial Max Pooling
*** RNN
*** Spatial Batch Normalization
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks
** Unit Tests

  was:
This issue tracks the creation of a layers-based *deep learning library* in 
pure DML.

The library contains layers with simple {{forward}} (function evaluation) and 
{{backward}} (gradient computation) functions for affine, convolution (start 
with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), dropout, 
loss functions, other layers, optimizers, and gradient checks.

*Examples*: Please see example *scripts* and *notebooks* in the {{examples}} 
folder: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN/examples].

*SystemML-NN*: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks


> Deep Learning DML Library
> -------------------------
>
> Key: SYSTEMML-618
> URL: https://issues.apache.org/jira/browse/SYSTEMML-618
> Project: SystemML
>  Issue Type: New Feature
>Reporter: Mike Dusenberry
>Assignee: Mike Dusenberry
>
> This issue tracks the creation of a layers-based *deep learning library* in 
> pure DML.
> The library contains layers with simple {{forward}} (function evaluation) and 
> {{backward}} (gradient computation) functions for affine, convolution (start 
> with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), 
> dropout, loss functions, other layers, optimizers, and gradient checks.
> *Examples*: Please see example *scripts* and *notebooks* in the {{examples}} 
> folder: 
> [https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN/examples].
> *SystemML-NN*: 
> [https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
> * Layers:
> ** Core:
> *** Affine
> *** Batch Normalization
> *** Spatial Convolution
> *** LSTM
> *** Spatial Max Pooling
> *** RNN
> *** Spatial Batch Normalization
> ** Nonlinearities:
> *** ReLU
> *** Sigmoid
> *** Softmax
> *** Tanh
> ** Loss:
> *** Cross-entropy loss
> *** L1 loss
> *** L2 loss
> *** Log ("Logistic") loss
> ** Regularization:
> *** Dropout
> *** L1 reg
> *** L2 reg
> * Optimizers:
> ** Adagrad
> ** Adam
> ** RMSprop
> ** SGD
> ** SGD w/ Momentum
> ** SGD w/ Nesterov Momentum
> * Tests:
> ** Gradient Checks
> ** Unit Tests





[jira] [Updated] (SYSTEMML-618) Deep Learning DML Library

2016-07-08 Thread Mike Dusenberry (JIRA)

 [ 
https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Dusenberry updated SYSTEMML-618:
-------------------------------------
Description: 
This issue tracks the creation of a layers-based *deep learning library* in 
pure DML.

The library contains layers with simple {{forward}} (function evaluation) and 
{{backward}} (gradient computation) functions for affine, convolution (start 
with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), dropout, 
loss functions, other layers, optimizers, and gradient checks.

*SystemML-NN*: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks

*Examples*: Please see example *scripts* and *notebooks* in the {{examples}} 
folder: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN/examples].

  was:
This issue tracks the creation of a layers-based *deep learning library* in 
pure DML.

The library contains layers with simple {{forward}} (function evaluation) and 
{{backward}} (gradient computation) functions for affine, convolution (start 
with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), dropout, 
loss functions, other layers, optimizers, and gradient checks.

*SystemML-NN*: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks


> Deep Learning DML Library
> -------------------------
>
> Key: SYSTEMML-618
> URL: https://issues.apache.org/jira/browse/SYSTEMML-618
> Project: SystemML
>  Issue Type: New Feature
>Reporter: Mike Dusenberry
>Assignee: Mike Dusenberry
>
> This issue tracks the creation of a layers-based *deep learning library* in 
> pure DML.
> The library contains layers with simple {{forward}} (function evaluation) and 
> {{backward}} (gradient computation) functions for affine, convolution (start 
> with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), 
> dropout, loss functions, other layers, optimizers, and gradient checks.
> *SystemML-NN*: 
> [https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
> * Layers:
> ** Core:
> *** Affine
> *** Spatial Convolution
> *** LSTM
> *** Max Pooling
> *** RNN
> ** Nonlinearities:
> *** ReLU
> *** Sigmoid
> *** Softmax
> *** Tanh
> ** Loss:
> *** Cross-entropy loss
> *** L1 loss
> *** L2 loss
> *** Log ("Logistic") loss
> ** Regularization:
> *** Dropout
> *** L1 reg
> *** L2 reg
> * Optimizers:
> ** Adagrad
> ** Adam
> ** RMSprop
> ** SGD
> ** SGD w/ Momentum
> ** SGD w/ Nesterov Momentum
> * Tests:
> ** Gradient Checks
> *Examples*: Please see example *scripts* and *notebooks* in the {{examples}} 
> folder: 
> [https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN/examples].



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (SYSTEMML-618) Deep Learning DML Library

2016-07-08 Thread Mike Dusenberry (JIRA)

 [ 
https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Dusenberry updated SYSTEMML-618:
-------------------------------------
Description: 
This issue tracks the creation of a layers-based *deep learning library* in 
pure DML.

The library contains layers with simple {{forward}} (function evaluation) and 
{{backward}} (gradient computation) functions for affine, convolution (start 
with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), dropout, 
loss functions, other layers, optimizers, and gradient checks.

*Examples*: Please see example *scripts* and *notebooks* in the {{examples}} 
folder: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN/examples].

*SystemML-NN*: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks

  was:
This issue tracks the creation of a layers-based *deep learning library* in 
pure DML.

The library contains layers with simple {{forward}} (function evaluation) and 
{{backward}} (gradient computation) functions for affine, convolution (start 
with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), dropout, 
loss functions, other layers, optimizers, and gradient checks.

*SystemML-NN*: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks

*Examples*: Please see example *scripts* and *notebooks* in the {{examples}} 
folder: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN/examples].


> Deep Learning DML Library
> -------------------------
>
> Key: SYSTEMML-618
> URL: https://issues.apache.org/jira/browse/SYSTEMML-618
> Project: SystemML
>  Issue Type: New Feature
>Reporter: Mike Dusenberry
>Assignee: Mike Dusenberry
>
> This issue tracks the creation of a layers-based *deep learning library* in 
> pure DML.
> The library contains layers with simple {{forward}} (function evaluation) and 
> {{backward}} (gradient computation) functions for affine, convolution (start 
> with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), 
> dropout, loss functions, other layers, optimizers, and gradient checks.
> *Examples*: Please see example *scripts* and *notebooks* in the {{examples}} 
> folder: 
> [https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN/examples].
> *SystemML-NN*: 
> [https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
> * Layers:
> ** Core:
> *** Affine
> *** Spatial Convolution
> *** LSTM
> *** Max Pooling
> *** RNN
> ** Nonlinearities:
> *** ReLU
> *** Sigmoid
> *** Softmax
> *** Tanh
> ** Loss:
> *** Cross-entropy loss
> *** L1 loss
> *** L2 loss
> *** Log ("Logistic") loss
> ** Regularization:
> *** Dropout
> *** L1 reg
> *** L2 reg
> * Optimizers:
> ** Adagrad
> ** Adam
> ** RMSprop
> ** SGD
> ** SGD w/ Momentum
> ** SGD w/ Nesterov Momentum
> * Tests:
> ** Gradient Checks





[jira] [Updated] (SYSTEMML-618) Deep Learning DML Library

2016-07-08 Thread Mike Dusenberry (JIRA)

 [ 
https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Dusenberry updated SYSTEMML-618:
-------------------------------------
Description: 
This issue tracks the creation of a layers-based *deep learning library* in 
pure DML.

The library contains layers with simple {{forward}} (function evaluation) and 
{{backward}} (gradient computation) functions for affine, convolution (start 
with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), dropout, 
loss functions, other layers, optimizers, and gradient checks.

*SystemML-NN*: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks

  was:
This issue tracks the creation of a layers-based *deep learning library* in 
pure DML.

The library contains layers with simple {{forward}} (function evaluation) and 
{{backward}} (gradient computation) functions for affine, convolution (start 
with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), dropout, 
loss functions, other layers, optimizers, and gradient checks.

*SystemML-NN*: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
_Current status:_
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks


> Deep Learning DML Library
> -------------------------
>
> Key: SYSTEMML-618
> URL: https://issues.apache.org/jira/browse/SYSTEMML-618
> Project: SystemML
>  Issue Type: New Feature
>Reporter: Mike Dusenberry
>Assignee: Mike Dusenberry
>
> This issue tracks the creation of a layers-based *deep learning library* in 
> pure DML.
> The library contains layers with simple {{forward}} (function evaluation) and 
> {{backward}} (gradient computation) functions for affine, convolution (start 
> with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), 
> dropout, loss functions, other layers, optimizers, and gradient checks.
> *SystemML-NN*: 
> [https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
> * Layers:
> ** Core:
> *** Affine
> *** Spatial Convolution
> *** LSTM
> *** Max Pooling
> *** RNN
> ** Nonlinearities:
> *** ReLU
> *** Sigmoid
> *** Softmax
> *** Tanh
> ** Loss:
> *** Cross-entropy loss
> *** L1 loss
> *** L2 loss
> *** Log ("Logistic") loss
> ** Regularization:
> *** Dropout
> *** L1 reg
> *** L2 reg
> * Optimizers:
> ** Adagrad
> ** Adam
> ** RMSprop
> ** SGD
> ** SGD w/ Momentum
> ** SGD w/ Nesterov Momentum
> * Tests:
> ** Gradient Checks





[jira] [Updated] (SYSTEMML-618) Deep Learning DML Library

2016-07-08 Thread Mike Dusenberry (JIRA)

 [ 
https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Dusenberry updated SYSTEMML-618:
-------------------------------------
Description: 
This issue tracks the creation of a layers-based *deep learning library* in 
pure DML.

The library contains layers with simple {{forward}} (function evaluation) and 
{{backward}} (gradient computation) functions for affine, convolution (start 
with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), dropout, 
loss functions, other layers, optimizers, and gradient checks.

*SystemML-NN*: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
_Current status:_
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks

  was:
This issue tracks the creation of a layers-based **deep learning library** in 
pure DML.

The library contains layers with simple {{forward}} (function evaluation) and 
{{backward}} (gradient computation) functions for affine, convolution (start 
with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), dropout, 
loss functions, other layers, optimizers, and gradient checks.

*SystemML-NN*: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
_Current status:_
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks


> Deep Learning DML Library
> -------------------------
>
> Key: SYSTEMML-618
> URL: https://issues.apache.org/jira/browse/SYSTEMML-618
> Project: SystemML
>  Issue Type: New Feature
>Reporter: Mike Dusenberry
>Assignee: Mike Dusenberry
>
> This issue tracks the creation of a layers-based *deep learning library* in 
> pure DML.
> The library contains layers with simple {{forward}} (function evaluation) and 
> {{backward}} (gradient computation) functions for affine, convolution (start 
> with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), 
> dropout, loss functions, other layers, optimizers, and gradient checks.
> *SystemML-NN*: 
> [https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
> _Current status:_
> * Layers:
> ** Core:
> *** Affine
> *** Spatial Convolution
> *** LSTM
> *** Max Pooling
> *** RNN
> ** Nonlinearities:
> *** ReLU
> *** Sigmoid
> *** Softmax
> *** Tanh
> ** Loss:
> *** Cross-entropy loss
> *** L1 loss
> *** L2 loss
> *** Log ("Logistic") loss
> ** Regularization:
> *** Dropout
> *** L1 reg
> *** L2 reg
> * Optimizers:
> ** Adagrad
> ** Adam
> ** RMSprop
> ** SGD
> ** SGD w/ Momentum
> ** SGD w/ Nesterov Momentum
> * Tests:
> ** Gradient Checks





[jira] [Updated] (SYSTEMML-618) Deep Learning DML Library

2016-07-08 Thread Mike Dusenberry (JIRA)

 [ 
https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Dusenberry updated SYSTEMML-618:
-------------------------------------
Description: 
This issue tracks the creation of a layers-based deep learning library in pure 
DML.

The library contains layers with simple {{forward}} (function evaluation) and 
{{backward}} (gradient computation) functions for affine, convolution (start 
with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), dropout, 
loss functions, other layers, optimizers, and gradient checks.

*SystemML-NN*: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
_Current status:_
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks

  was:
This issue tracks the creation of an experimental, layers-based library in pure 
PyDML & DML that contains layers with simple forward/backward APIs for affine, 
convolution (start with 2D), max-pooling, non-linearities (relu, sigmoid, 
softmax, etc.), dropout, loss functions, other layers, optimizers, and gradient 
checks.

*SystemML-NN*: 
[https://github.com/dusenberrymw/systemml-nn|https://github.com/dusenberrymw/systemml-nn]
_Current status:_
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks


> Deep Learning DML Library
> -------------------------
>
> Key: SYSTEMML-618
> URL: https://issues.apache.org/jira/browse/SYSTEMML-618
> Project: SystemML
>  Issue Type: New Feature
>Reporter: Mike Dusenberry
>Assignee: Mike Dusenberry
>
> This issue tracks the creation of a layers-based deep learning library in 
> pure DML.
> The library contains layers with simple {{forward}} (function evaluation) and 
> {{backward}} (gradient computation) functions for affine, convolution (start 
> with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), 
> dropout, loss functions, other layers, optimizers, and gradient checks.
> *SystemML-NN*: 
> [https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
> _Current status:_
> * Layers:
> ** Core:
> *** Affine
> *** Spatial Convolution
> *** LSTM
> *** Max Pooling
> *** RNN
> ** Nonlinearities:
> *** ReLU
> *** Sigmoid
> *** Softmax
> *** Tanh
> ** Loss:
> *** Cross-entropy loss
> *** L1 loss
> *** L2 loss
> *** Log ("Logistic") loss
> ** Regularization:
> *** Dropout
> *** L1 reg
> *** L2 reg
> * Optimizers:
> ** Adagrad
> ** Adam
> ** RMSprop
> ** SGD
> ** SGD w/ Momentum
> ** SGD w/ Nesterov Momentum
> * Tests:
> ** Gradient Checks





[jira] [Updated] (SYSTEMML-618) Deep Learning DML Library

2016-07-08 Thread Mike Dusenberry (JIRA)

 [ 
https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Dusenberry updated SYSTEMML-618:
-------------------------------------
Description: 
This issue tracks the creation of a layers-based **deep learning library** in 
pure DML.

The library contains layers with simple {{forward}} (function evaluation) and 
{{backward}} (gradient computation) functions for affine, convolution (start 
with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), dropout, 
loss functions, other layers, optimizers, and gradient checks.

*SystemML-NN*: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
_Current status:_
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks

  was:
This issue tracks the creation of a layers-based deep learning library in pure 
DML.

The library contains layers with simple {{forward}} (function evaluation) and 
{{backward}} (gradient computation) functions for affine, convolution (start 
with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), dropout, 
loss functions, other layers, optimizers, and gradient checks.

*SystemML-NN*: 
[https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
_Current status:_
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks


> Deep Learning DML Library
> -------------------------
>
> Key: SYSTEMML-618
> URL: https://issues.apache.org/jira/browse/SYSTEMML-618
> Project: SystemML
>  Issue Type: New Feature
>Reporter: Mike Dusenberry
>Assignee: Mike Dusenberry
>
> This issue tracks the creation of a layers-based **deep learning library** in 
> pure DML.
> The library contains layers with simple {{forward}} (function evaluation) and 
> {{backward}} (gradient computation) functions for affine, convolution (start 
> with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), 
> dropout, loss functions, other layers, optimizers, and gradient checks.
> *SystemML-NN*: 
> [https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
> _Current status:_
> * Layers:
> ** Core:
> *** Affine
> *** Spatial Convolution
> *** LSTM
> *** Max Pooling
> *** RNN
> ** Nonlinearities:
> *** ReLU
> *** Sigmoid
> *** Softmax
> *** Tanh
> ** Loss:
> *** Cross-entropy loss
> *** L1 loss
> *** L2 loss
> *** Log ("Logistic") loss
> ** Regularization:
> *** Dropout
> *** L1 reg
> *** L2 reg
> * Optimizers:
> ** Adagrad
> ** Adam
> ** RMSprop
> ** SGD
> ** SGD w/ Momentum
> ** SGD w/ Nesterov Momentum
> * Tests:
> ** Gradient Checks





[jira] [Updated] (SYSTEMML-618) Deep Learning DML Library

2016-05-10 Thread Mike Dusenberry (JIRA)

 [ 
https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Dusenberry updated SYSTEMML-618:
-------------------------------------
Description: 
This issue tracks the creation of an experimental, layers-based library in pure 
PyDML & DML that contains layers with simple forward/backward APIs for affine, 
convolution (start with 2D), max-pooling, non-linearities (relu, sigmoid, 
softmax, etc.), dropout, loss functions, other layers, optimizers, and gradient 
checks.

*SystemML-NN*: 
[https://github.com/dusenberrymw/systemml-nn|https://github.com/dusenberrymw/systemml-nn]
_Current status:_
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
* Tests:
** Gradient Checks

  was:
This issue tracks the creation of an experimental, layers-based library in pure 
PyDML & DML that contains layers with simple forward/backward APIs for affine, 
convolution (start with 2D), max-pooling, non-linearities (relu, sigmoid, 
softmax, etc.), dropout, loss functions, other layers, optimizers, and gradient 
checks.

*SystemML-NN*: 
[https://github.com/dusenberrymw/systemml-nn|https://github.com/dusenberrymw/systemml-nn]
_Current status:_
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum


> Deep Learning DML Library
> -------------------------
>
> Key: SYSTEMML-618
> URL: https://issues.apache.org/jira/browse/SYSTEMML-618
> Project: SystemML
>  Issue Type: New Feature
>Reporter: Mike Dusenberry
>Assignee: Mike Dusenberry
>
> This issue tracks the creation of an experimental, layers-based library in 
> pure PyDML & DML that contains layers with simple forward/backward APIs for 
> affine, convolution (start with 2D), max-pooling, non-linearities (relu, 
> sigmoid, softmax, etc.), dropout, loss functions, other layers, optimizers, 
> and gradient checks.
> *SystemML-NN*: 
> [https://github.com/dusenberrymw/systemml-nn|https://github.com/dusenberrymw/systemml-nn]
> _Current status:_
> * Layers:
> ** Core:
> *** Affine
> *** Spatial Convolution
> *** LSTM
> *** Max Pooling
> *** RNN
> ** Nonlinearities:
> *** ReLU
> *** Sigmoid
> *** Softmax
> *** Tanh
> ** Loss:
> *** Cross-entropy loss
> *** L1 loss
> *** L2 loss
> *** Log ("Logistic") loss
> ** Regularization:
> *** Dropout
> *** L1 reg
> *** L2 reg
> * Optimizers:
> ** Adagrad
> ** Adam
> ** RMSprop
> ** SGD
> ** SGD w/ Momentum
> ** SGD w/ Nesterov Momentum
> * Tests:
> ** Gradient Checks



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (SYSTEMML-618) Deep Learning DML Library

2016-05-10 Thread Mike Dusenberry (JIRA)

 [ 
https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Dusenberry updated SYSTEMML-618:
-
Description: 
This issue tracks the creation of an experimental, layers-based library in pure 
PyDML & DML that contains layers with simple forward/backward APIs for affine, 
convolution (start with 2D), max-pooling, non-linearities (relu, sigmoid, 
softmax, etc.), dropout, loss functions, other layers, optimizers, and gradient 
checks.

*SystemML-NN*: 
[https://github.com/dusenberrymw/systemml-nn|https://github.com/dusenberrymw/systemml-nn]
_Current status:_
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum
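The momentum-based optimizers in this list follow the standard update rules. As a hedged sketch (NumPy for illustration; hyperparameter names are assumptions, not the library's API):

```python
import numpy as np

def sgd_momentum_update(x, dx, v, lr=0.01, mu=0.9):
    """Classical momentum: v <- mu*v - lr*dx; x <- x + v."""
    v = mu * v - lr * dx
    return x + v, v

def sgd_nesterov_update(x, dx, v, lr=0.01, mu=0.9):
    """Nesterov momentum in the common 'lookahead' formulation."""
    v_prev = v
    v = mu * v - lr * dx
    return x - mu * v_prev + (1 + mu) * v, v
```

Both take the current parameters `x`, the gradient `dx`, and the velocity state `v`, and return the updated parameters and velocity; the caller threads the velocity through successive iterations.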

  was:
This issue tracks the creation of an experimental, layers-based library in pure 
PyDML & DML that contains layers with simple forward/backward APIs for affine, 
convolution (start with 2D), max-pooling, non-linearities (relu, sigmoid, 
softmax, etc.), dropout, loss functions, other layers, optimizers, and gradient 
checks.

*SystemML-NN*: 
[https://github.com/dusenberrymw/systemml-nn|https://github.com/dusenberrymw/systemml-nn]
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum


> Deep Learning DML Library
> -
>
> Key: SYSTEMML-618
> URL: https://issues.apache.org/jira/browse/SYSTEMML-618
> Project: SystemML
>  Issue Type: New Feature
>Reporter: Mike Dusenberry
>Assignee: Mike Dusenberry
>
> This issue tracks the creation of an experimental, layers-based library in 
> pure PyDML & DML that contains layers with simple forward/backward APIs for 
> affine, convolution (start with 2D), max-pooling, non-linearities (relu, 
> sigmoid, softmax, etc.), dropout, loss functions, other layers, optimizers, 
> and gradient checks.
> *SystemML-NN*: 
> [https://github.com/dusenberrymw/systemml-nn|https://github.com/dusenberrymw/systemml-nn]
> _Current status:_
> * Layers:
> ** Core:
> *** Affine
> *** Spatial Convolution
> *** LSTM
> *** Max Pooling
> *** RNN
> ** Nonlinearities:
> *** ReLU
> *** Sigmoid
> *** Softmax
> *** Tanh
> ** Loss:
> *** Cross-entropy loss
> *** L1 loss
> *** L2 loss
> *** Log ("Logistic") loss
> ** Regularization:
> *** Dropout
> *** L1 reg
> *** L2 reg
> * Optimizers:
> ** Adagrad
> ** Adam
> ** RMSprop
> ** SGD
> ** SGD w/ Momentum
> ** SGD w/ Nesterov Momentum





[jira] [Updated] (SYSTEMML-618) Deep Learning DML Library

2016-05-10 Thread Mike Dusenberry (JIRA)

 [ 
https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Dusenberry updated SYSTEMML-618:
-
Description: 
This issue tracks the creation of an experimental, layers-based library in pure 
PyDML & DML that contains layer abstractions with simple forward/backward APIs 
for affine, convolution (start with 2D), max-pooling, non-linearities (relu, 
sigmoid, softmax, etc.), dropout, loss functions, other layers, optimizers, and 
gradient checks.

*SystemML-NN*: 
[https://github.com/dusenberrymw/systemml-nn|https://github.com/dusenberrymw/systemml-nn]
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg
* Optimizers:
** Adagrad
** Adam
** RMSprop
** SGD
** SGD w/ Momentum
** SGD w/ Nesterov Momentum

  was:
This issue tracks the creation of an experimental, layers-based library in pure 
PyDML & DML that contains layer abstractions with simple forward/backward APIs 
for affine, convolution (start with 2D), max-pooling, non-linearities (relu, 
sigmoid, softmax, etc.), dropout, loss functions, other layers, optimizers, and 
gradient checks.

*SystemML-NN*: 
[https://github.com/dusenberrymw/systemml-nn|https://github.com/dusenberrymw/systemml-nn]
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg


> Deep Learning DML Library
> -
>
> Key: SYSTEMML-618
> URL: https://issues.apache.org/jira/browse/SYSTEMML-618
> Project: SystemML
>  Issue Type: New Feature
>Reporter: Mike Dusenberry
>Assignee: Mike Dusenberry
>
> This issue tracks the creation of an experimental, layers-based library in 
> pure PyDML & DML that contains layer abstractions with simple 
> forward/backward APIs for affine, convolution (start with 2D), max-pooling, 
> non-linearities (relu, sigmoid, softmax, etc.), dropout, loss functions, 
> other layers, optimizers, and gradient checks.
> *SystemML-NN*: 
> [https://github.com/dusenberrymw/systemml-nn|https://github.com/dusenberrymw/systemml-nn]
> * Layers:
> ** Core:
> *** Affine
> *** Spatial Convolution
> *** LSTM
> *** Max Pooling
> *** RNN
> ** Nonlinearities:
> *** ReLU
> *** Sigmoid
> *** Softmax
> *** Tanh
> ** Loss:
> *** Cross-entropy loss
> *** L1 loss
> *** L2 loss
> *** Log ("Logistic") loss
> ** Regularization:
> *** Dropout
> *** L1 reg
> *** L2 reg
> * Optimizers:
> ** Adagrad
> ** Adam
> ** RMSprop
> ** SGD
> ** SGD w/ Momentum
> ** SGD w/ Nesterov Momentum





[jira] [Updated] (SYSTEMML-618) Deep Learning DML Library

2016-05-10 Thread Mike Dusenberry (JIRA)

 [ 
https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Dusenberry updated SYSTEMML-618:
-
Description: 
This issue tracks the creation of an experimental, layers-based library in pure 
PyDML & DML that contains layer abstractions with simple forward/backward APIs 
for affine, convolution (start with 2D), max-pooling, non-linearities (relu, 
sigmoid, softmax, etc.), dropout, loss functions, other layers, optimizers, and 
gradient checks.

*SystemML-NN*: 
[https://github.com/dusenberrymw/systemml-nn|https://github.com/dusenberrymw/systemml-nn]
* Layers:
** Core:
*** Affine
*** Spatial Convolution
*** LSTM
*** Max Pooling
*** RNN
** Nonlinearities:
*** ReLU
*** Sigmoid
*** Softmax
*** Tanh
** Loss:
*** Cross-entropy loss
*** L1 loss
*** L2 loss
*** Log ("Logistic") loss
** Regularization:
*** Dropout
*** L1 reg
*** L2 reg

  was:This issue tracks the creation of an experimental, layers-based library 
in pure DML that contains layer abstractions with simple forward/backward APIs 
for affine, convolution (start with 2D), max-pooling, non-linearities (relu, 
sigmoid, softmax, etc.), dropout, loss functions, other layers, optimizers, and 
gradient checks.


> Deep Learning DML Library
> -
>
> Key: SYSTEMML-618
> URL: https://issues.apache.org/jira/browse/SYSTEMML-618
> Project: SystemML
>  Issue Type: New Feature
>Reporter: Mike Dusenberry
>Assignee: Mike Dusenberry
>
> This issue tracks the creation of an experimental, layers-based library in 
> pure PyDML & DML that contains layer abstractions with simple 
> forward/backward APIs for affine, convolution (start with 2D), max-pooling, 
> non-linearities (relu, sigmoid, softmax, etc.), dropout, loss functions, 
> other layers, optimizers, and gradient checks.
> *SystemML-NN*: 
> [https://github.com/dusenberrymw/systemml-nn|https://github.com/dusenberrymw/systemml-nn]
> * Layers:
> ** Core:
> *** Affine
> *** Spatial Convolution
> *** LSTM
> *** Max Pooling
> *** RNN
> ** Nonlinearities:
> *** ReLU
> *** Sigmoid
> *** Softmax
> *** Tanh
> ** Loss:
> *** Cross-entropy loss
> *** L1 loss
> *** L2 loss
> *** Log ("Logistic") loss
> ** Regularization:
> *** Dropout
> *** L1 reg
> *** L2 reg





[jira] [Updated] (SYSTEMML-618) Deep Learning DML Library

2016-04-06 Thread Mike Dusenberry (JIRA)

 [ 
https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Dusenberry updated SYSTEMML-618:
-
Issue Type: New Feature  (was: Sub-task)
Parent: (was: SYSTEMML-540)

> Deep Learning DML Library
> -
>
> Key: SYSTEMML-618
> URL: https://issues.apache.org/jira/browse/SYSTEMML-618
> Project: SystemML
>  Issue Type: New Feature
>Reporter: Mike Dusenberry
>Assignee: Mike Dusenberry
>
> Create an experimental, layers-based library in pure DML to contain layer 
> abstractions with simple forward/backward APIs for affine, convolution (start 
> with 2D), max-pooling, non-linearities (relu, sigmoid, softmax, etc.), 
> dropout, loss functions, other layers, optimizers, and gradient checks.


