szha commented on a change in pull request #9662: Gluon PReLU, ELU, SELU, Swish
URL: https://github.com/apache/incubator-mxnet/pull/9662#discussion_r165807547
 
 

 ##########
 File path: src/operator/leaky_relu-inl.h
 ##########
 @@ -177,9 +182,20 @@ class LeakyReLUOp : public Operator {
       case leakyrelu::kPReLU: {
         weight = in_data[leakyrelu::kGamma].get<xpu, 1, real_t>(s);
         grad_weight = in_grad[leakyrelu::kGamma].get<xpu, 1, real_t>(s);
-        grad_weight = sumall_except_dim<1>(F<prelu_grad>(data) * grad);
-        gdata = F<mshadow_op::xelu_grad>(data, mshadow::expr::broadcast<1>(weight, data.shape_))
-                * grad;
+        if (weight.shape_[0] == 1) {
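
This is not part of the PR: a minimal NumPy sketch of the backward math
this hunk computes, assuming NCHW layout with the learned alpha
("weight") on axis 1; prelu_backward and its argument names are
hypothetical stand-ins for the C++ identifiers above.

    import numpy as np

    def prelu_backward(data, grad, weight):
        """Backward for PReLU: y = x if x > 0 else alpha * x."""
        # Broadcast alpha over (N, H, W), mirroring
        # mshadow::expr::broadcast<1>(weight, data.shape_).
        alpha = weight.reshape(1, -1, 1, 1)
        # d(loss)/d(x): xelu_grad(x, a) is 1 where x > 0, else a.
        gdata = np.where(data > 0, 1.0, alpha) * grad
        # d(loss)/d(alpha): prelu_grad(x) is x where x <= 0, else 0.
        per_elem = np.where(data > 0, 0.0, data) * grad
        if weight.shape[0] == 1:
            # Shared alpha (the new branch): reduce over every axis.
            return gdata, np.atleast_1d(per_elem.sum())
        # Per-channel alpha: sum over all axes except the channel
        # axis, matching sumall_except_dim<1>(...).
        return gdata, per_elem.sum(axis=(0, 2, 3))

For example, data and grad would have shape (N, C, H, W), with weight
of shape (C,) for per-channel PReLU or (1,) for the shared-parameter
variant that the new branch handles.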
 
 Review comment:
   @bradcar sorry that I missed your comment earlier, and thanks for sharing
your work. In this PR I'd like to focus first on wrapping up the activations
from the previous two PRs. Since you wrote the paper, would you like to
implement that in MXNet?

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
