[ 
https://issues.apache.org/jira/browse/SINGA-60?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14702987#comment-14702987
 ] 

ASF subversion and git services commented on SINGA-60:
------------------------------------------------------

Commit 6afa895b8ea060a532ea01f1f4484c9db11a2496 in incubator-singa's branch 
refs/heads/master from Wei Wang
[ https://git-wip-us.apache.org/repos/asf?p=incubator-singa.git;h=6afa895 ]

SINGA-60 Make learning rate and param init modular

Created a base class for getting learning rate, which is changed during
training.
Created a base class for initializing parameter values.

SINGA comes with a couple of built-in implementations for the two base
classes.
Users can also implement their own learning rate changing methods and
parameter initializing methods by extending the corresponding base
classes.
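For illustration, a parameter initializer base class following the same pattern might look like the sketch below; the class and proto names here are hypothetical stand-ins, not SINGA's actual API:

```cpp
#include <random>
#include <vector>

// Hypothetical configuration message; the real one would be generated
// from SINGA's .proto definitions, and its fields may differ.
struct InitProto {
  float low = -0.05f;
  float high = 0.05f;
};

// Sketch of a base class for initializing parameter values, mirroring
// the LRGenerator design from this ticket.
class ParamInitializer {
 public:
  virtual ~ParamInitializer() {}
  virtual void Fill(std::vector<float>* data) = 0;
 protected:
  InitProto proto_;
};

// A built-in-style implementation: uniform random values in [low, high).
class UniformInit : public ParamInitializer {
 public:
  explicit UniformInit(const InitProto& proto) { proto_ = proto; }
  void Fill(std::vector<float>* data) override {
    std::mt19937 rng(42);  // fixed seed so the sketch is reproducible
    std::uniform_real_distribution<float> dist(proto_.low, proto_.high);
    for (auto& v : *data) v = dist(rng);
  }
};
```

A user-defined initializer (e.g. Gaussian) would subclass ParamInitializer the same way and read any extra settings from an extended proto.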


> Make learning rate and param init modular
> -----------------------------------------
>
>                 Key: SINGA-60
>                 URL: https://issues.apache.org/jira/browse/SINGA-60
>             Project: Singa
>          Issue Type: Improvement
>            Reporter: wangwei
>
> The learning rate of SGD typically changes through time.
> There are many different ways to change the learning rate of SGD, and 
> SINGA has implemented a couple of them. But users may want to implement 
> their own schedule. To make this part modular, this ticket creates a base 
> learning rate generator class, e.g. LRGenerator, declared as:
> {code}
> class LRGenerator {
>   public:
>     virtual float Get(int step) = 0;
>   protected:
>     LRProto proto_;
> };
> {code}
> Users can then inherit LRGenerator to implement their own changing 
> algorithm in `Get(int step)`.
> Users can also add configurations for their generator by extending the base 
> LRProto.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
