chrishkchris edited a comment on issue #685:
URL: https://github.com/apache/singa/pull/685#issuecomment-617508136


   One point to notice is that:
   
   Currently in SINGA, ReLU is stateless (except that we record the input for
backward propagation), so we can simply write y = autograd.relu(y) without
manually initializing a ReLU Python object.
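   
   For illustration, here is a minimal sketch of the functional style. It assumes a SINGA build where singa.autograd, singa.tensor, and singa.device are importable; the tensor shape and values are just placeholders:
   
   ```python
   from singa import autograd, device, tensor
   
   dev = device.get_default_device()
   autograd.training = True  # record inputs so backward() can be run later
   
   # stateless functional call: the caller never creates a ReLU object
   x = tensor.Tensor(shape=(2, 3), device=dev)
   x.gaussian(0.0, 1.0)
   y = autograd.relu(x)  # returns the output tensor directly
   ```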
   
   On the other hand, if we create a ReLU object manually, we have to take
element [0] of the result, following
https://github.com/apache/singa/blob/master/python/singa/autograd.py#L448
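   
   A sketch of the manual style, under the same assumptions as above (calling the operator returns a tuple of outputs, which is why the [0] is needed):
   
   ```python
   # manual style: calling the ReLU operator returns a tuple of outputs,
   # so the output tensor has to be unpacked with [0]
   relu_op = autograd.ReLU()
   y = relu_op(x)[0]
   ```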
   
   All in all, I personally suggest that we use y = autograd.relu(y) directly.

