haojin2 commented on a change in pull request #14673: [BUGFIX] fix ELU function will appear nan when calculating the gradient
URL: https://github.com/apache/incubator-mxnet/pull/14673#discussion_r274750867
##########
File path: python/mxnet/gluon/nn/activations.py
##########
@@ -158,7 +158,8 @@ def __init__(self, alpha=1.0, **kwargs):
         self._alpha = alpha

     def hybrid_forward(self, F, x):
-        return F.where(x > 0, x, self._alpha * (F.exp(x) - 1.0))
+        _x = F.where(x < 0, x, F.zeros_like(x))
+        return F.where(x > 0, x, self._alpha * (F.exp(_x) - 1.0))
Review comment:
Thanks for your contribution!
Actually, we already have ELU implemented in the backend, so you can call it directly.
Please use:
```Python
return F.LeakyReLU(x, act_type='elu', slope=self._alpha)
```
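For context, a minimal sketch of what the whole block could look like after this change (the class skeleton is assumed from the `ELU` block in `python/mxnet/gluon/nn/activations.py`; only the last line is the actual suggestion):
```Python
from mxnet.gluon import HybridBlock

class ELU(HybridBlock):
    """ELU activation, delegating to the fused backend LeakyReLU operator."""

    def __init__(self, alpha=1.0, **kwargs):
        super(ELU, self).__init__(**kwargs)
        self._alpha = alpha

    def hybrid_forward(self, F, x):
        # act_type='elu' computes x if x > 0 else slope * (exp(x) - 1)
        # in a single kernel, so no standalone exp(x) term can overflow
        # and leak NaN into the gradient.
        return F.LeakyReLU(x, act_type='elu', slope=self._alpha)
```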
here instead. You will also want to change:
https://github.com/apache/incubator-mxnet/blob/master/tests/python/unittest/test_gluon.py#L1183
to:
```Python
return mx.nd.expm1(x) if x <= 0.0 else x
```
so that the test would still pass after this change (`mx.nd.expm1(x)` computes `exp(x) - 1.0` with better numerical precision for small `x`).
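For reference, a minimal standalone reproduction of the NaN gradient this PR fixes (illustrative only; it assumes the default float32 dtype, in which `exp(100)` overflows to `inf`):
```Python
import mxnet as mx
from mxnet import autograd

x = mx.nd.array([100.0])
x.attach_grad()
with autograd.record():
    # Pre-fix ELU with alpha = 1.0: where() evaluates both branches,
    # so exp(100) overflows to inf; backward then multiplies inf by
    # the 0-mask of the unselected branch, and 0 * inf = nan.
    y = mx.nd.where(x > 0, x, 1.0 * (mx.nd.exp(x) - 1.0))
y.backward()
print(x.grad)  # [nan] instead of the expected [1.]
```
The fused backend operator computes the gradient only from the branch that is actually selected, which is why delegating to `F.LeakyReLU` avoids the issue.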