[
https://issues.apache.org/jira/browse/SINGA-10?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14899904#comment-14899904
]
ASF subversion and git services commented on SINGA-10:
------------------------------------------------------
Commit ba3b1a5c70a813090ede30b0d5c95caa08e94da8 in incubator-singa's branch
refs/heads/master from [~flytosky]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-singa.git;h=ba3b1a5 ]
SINGA-10 Add Support for Recurrent Neural Networks (RNN)
* Move the functions of WordLayer into the EmbeddingLayer.
* Make DatLayer a subclass of both RNNLayer and DataLayer.
* create_shard.cc wraps the WordRecord inside singa::Record, and inserts
singa::Record into the DataShard.
* Make the inheritance of base layer classes like InputLayer, NeuronLayer, etc.
from Layer virtual, to avoid compilation problems if a future layer is
declared to inherit from two base layer classes.
* Update the documentation on the website for RNNLM example.
The training speed still needs optimization. The perplexity (PPL) is similar
to that from the RNNLM Toolkit.
> Add Support for Recurrent Neural Networks (RNN)
> -----------------------------------------------
>
> Key: SINGA-10
> URL: https://issues.apache.org/jira/browse/SINGA-10
> Project: Singa
> Issue Type: New Feature
> Reporter: wangwei
> Assignee: Zheng Kaiping
>
> The training algorithm for RNNs is Back-Propagation through time (BPTT). It
> is similar to the BP algorithm for feed-forward neural networks.
> The model structures are quite different from feed-forward models. Hence, we
> may need to inherit the base NeuralNet class to create an RNN class. The RNN
> class overrides the SetupNeuralNet function to:
> 1. parse the user configuration and create the RNN graph (with cycles)
> 2. break the cycles and unroll the graph through time
> 3. create and set up the layers
> Model partitioning is not considered in this ticket.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)