[ 
https://issues.apache.org/jira/browse/SINGA-140?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15213804#comment-15213804
 ] 

Zheng Kaiping commented on SINGA-140:
-------------------------------------

Hi Raunaq Abhyankar,

Thanks for your interest. As explained in the description, the bug occurs 
in parameter sharing when layers are unrolled.
Fixing it corrects the parameter updates and should yield better 
performance.

Since the bug affects parameter sharing, you can fix it first and then 
verify the correctness of the logic.

Best regards,
Kaiping




> A bug in CollectAll() function when layers are unrolled in neuralnet
> --------------------------------------------------------------------
>
>                 Key: SINGA-140
>                 URL: https://issues.apache.org/jira/browse/SINGA-140
>             Project: Singa
>          Issue Type: Bug
>            Reporter: Zheng Kaiping
>
> In SINGA_HOME/src/work.cc, in “int Worker::CollectAll(int step, NeuralNet* 
> net){}” function, the layers which are unrolled (except for the first one) 
> should not collect parameters, due to parameter sharing.
> Previous:
> if (layer->partition_id() == id_)
> Possible changes:
> if (layer->partition_id() == id_ && layer->unroll_index() == 0)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
