pengzhao-intel commented on issue #10089: enable all activations in MKLDNN.
URL: https://github.com/apache/incubator-mxnet/pull/10089#issuecomment-372908964
 
 
   Looking at the code, the two soft_relu implementations differ, which is why we get different results: MXNet thresholds large inputs and uses `log1p` to avoid overflow of `exp`, while the MKL-DNN reference computes `log(1 + exp(s))` directly.
   
   mxnet:
   
https://github.com/apache/incubator-mxnet/blob/c9ec3118688c233a66ad847003a9e8d2d09e5952/src/operator/mshadow_op.h#L136
   
   ```cpp
   /*! \brief SoftReLU, also known as softplus activation */
   struct softrelu : public mxnet_op::tunable {
     template<typename DType>
     MSHADOW_XINLINE static DType Map(DType a) {
       // Avoid overflow of exp for large inputs.
       // The threshold 20.0 is chosen such that softrelu(a) = a
       // for a > 20 in single (float) precision.
       if (a > DType(20.0f)) {
         return a;
       } else {
         return DType(math::log1p(math::exp(a)));
       }
     }
   };
   
   MXNET_UNARY_MATH_OP(softrelu_grad, -math::expm1(-a));
   
   ```
   
   mkldnn:
   
https://github.com/intel/mkl-dnn/blob/f5218ff4fd2d16d13aada2e632afd18f2514fee3/tests/gtests/test_eltwise.cpp#L101
   
   
   ```cpp
   template <typename T>
   T soft_relu_fwd(T s) {
       return logf(1 + ::expf(s));
   }
   
   template <typename T>
   T soft_relu_bwd(T dd, T s) {
       return dd / (1 + ::expf(-s));
   }
   ```
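   
   For a concrete view of the divergence, here is a minimal standalone sketch (not from either codebase; the function names are illustrative) that evaluates both forward formulas on a large input. `expf` overflows single precision around 88.7, so the direct `log(1 + exp(s))` form returns `inf`, while the thresholded `log1p` form stays finite:
   
   ```cpp
   #include <cmath>
   #include <cstdio>
   
   // MXNet-style forward: threshold large inputs and use log1p
   // so exp() never overflows.
   float softrelu_thresholded(float a) {
       if (a > 20.0f) return a;
       return std::log1p(std::exp(a));
   }
   
   // MKL-DNN test-reference style forward: direct log(1 + exp(s)).
   float softrelu_direct(float s) {
       return std::log(1.0f + std::exp(s));
   }
   
   int main() {
       float a = 100.0f;
       // expf(100) overflows float to inf, so the direct form prints inf,
       // while the thresholded form prints 100 exactly.
       std::printf("thresholded: %f\n", softrelu_thresholded(a));
       std::printf("direct:      %f\n", softrelu_direct(a));
       return 0;
   }
   ```
   
   For very negative inputs the two also differ in accuracy: `log1p(exp(a))` preserves the tiny result, while `log(1 + exp(a))` rounds `1 + exp(a)` to 1 and returns 0.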
   
   
