Since MXNet 1.3.1, `mx.symbol.RNN` supports dropout on CPU, which makes it a much more user-friendly way to build RNNs. For example, the whole architecture can be built from scratch with:

```r
data <- mx.symbol.Variable("data")
label <- mx.symbol.Variable("label")

# mx.symbol.RNN expects time-major input [seq_len, batch, features],
# so swap the first two axes of the batch-major input
data <- mx.symbol.swapaxes(data, dim1 = 0, dim2 = 1)
RNN <- mx.symbol.RNN(data = data,
                     state_size = 50,
                     num_layers = 1,
                     p = 0.2,                # dropout probability
                     state_outputs = TRUE,   # also return the hidden/cell states
                     mode = "lstm")
# RNN[[1]] is the output sequence; swap back to batch-major
data <- mx.symbol.swapaxes(data = RNN[[1]], dim1 = 0, dim2 = 1)
decode <- mx.symbol.FullyConnected(data = data, num_hidden = 1, flatten = FALSE)
loss <- mx.symbol.LinearRegressionOutput(data = decode, label = label)
```
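The two `swapaxes` calls exist only to move between batch-major and time-major layouts. As a minimal base-R sketch of that layout change (illustrative only, with made-up dimensions; it does not use MXNet), `aperm` performs the same axis swap on a plain array:

```r
# A batch-major array of shape (batch = 2, seq_len = 3, features = 4)
x <- array(seq_len(2 * 3 * 4), dim = c(2, 3, 4))

# Swap the first two axes, giving (seq_len = 3, batch = 2, features = 4),
# the time-major layout that the RNN operator consumes
y <- aperm(x, c(2, 1, 3))

dim(y)          # 3 2 4
y[1, 2, 1] == x[2, 1, 1]  # element at (t = 1, b = 2) came from (b = 2, t = 1)
```

Note that `mx.symbol.swapaxes` uses 0-based axis indices (`dim1 = 0, dim2 = 1`), whereas base R's `aperm` uses a 1-based permutation vector.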

I think the above would provide a less intimidating introduction to RNNs.

[ Full content available at: 
https://github.com/apache/incubator-mxnet/pull/12664 ]