joddiy commented on a change in pull request #480: SINGA-474 ELU operator
URL: https://github.com/apache/incubator-singa/pull/480#discussion_r309687031
 
 

 ##########
 File path: python/singa/autograd.py
 ##########
 @@ -358,6 +358,48 @@ def relu(x):
     return ReLU()(x)[0]
 
 
+class Elu(Operation):
+    def __init__(self, alpha=1):
+        super(Elu, self).__init__()
+        self.alpha = alpha
+
+    def forward(self, x):
+        """Do forward propgation.
+        Store the x if requires gradient.
+        Args:
+            x (CTensor): matrix
+        Returns:
+            a CTensor for the result
+        """
+        # f(x) = alpha * (exp(x) - 1) for x < 0, f(x) = x for x >= 0
+        if training:
+            self.input = x
+        x1 = singa.LTFloat(x, 0.0)   # mask: 1.0 where x < 0, else 0.0
+        x1 = singa.__mul__(x, x1)    # keep only the negative part of x
+        x1 = singa.MultFloat(singa.SubFloat(singa.Exp(x1), 1.0), self.alpha)
+        x2 = singa.ReLU(x)           # the non-negative part of x
+        x1 = singa.__add__(x1, x2)
+        return x1
 
 Review comment:
  It's a little weird to call ReLU here; can we implement it directly?
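  In case it's useful, here is a minimal sketch of a direct, mask-based formulation that avoids the ReLU call. It assumes `singa.__sub__` and `singa.MultFloat` are exposed by the wrapper (both appear elsewhere in autograd.py); treat it as a suggestion, not the final implementation:

  ```python
  # f(x) = alpha * (exp(x) - 1) for x < 0, f(x) = x for x >= 0
  mask = singa.LTFloat(x, 0.0)                    # 1.0 where x < 0, else 0.0
  neg = singa.__mul__(x, mask)                    # negative part of x, 0 elsewhere
  neg = singa.MultFloat(singa.SubFloat(singa.Exp(neg), 1.0), self.alpha)
  pos = singa.__sub__(x, singa.__mul__(x, mask))  # x * (1 - mask): non-negative part
  return singa.__add__(neg, pos)
  ```

  Each term is zero outside its own branch, so the sum reproduces the piecewise definition without calling ReLU.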

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
