haojin2 commented on a change in pull request #17254: [numpy] change unary infer type
URL: https://github.com/apache/incubator-mxnet/pull/17254#discussion_r366774634
 
 

 ##########
 File path: src/operator/tensor/elemwise_unary_op.h
 ##########
 @@ -252,6 +252,39 @@ class UnaryOp : public OpBase {
     });
   }
 
+  template<typename xpu, typename OP>
+  static void ComputeMixedType(const nnvm::NodeAttrs& attrs,
+                      const OpContext& ctx,
+                      const std::vector<TBlob>& inputs,
+                      const std::vector<OpReqType>& req,
+                      const std::vector<TBlob>& outputs) {
+    mshadow::Stream<xpu> *s = ctx.get_stream<xpu>();
+
+    if (mxnet::common::is_float(inputs[0].type_flag_)) {
+      MSHADOW_REAL_TYPE_SWITCH(outputs[0].type_flag_, DType, {
+        MSHADOW_REAL_TYPE_SWITCH(inputs[0].type_flag_, IType, {
 
 Review comment:
  This is the case where the forward computation's input is a floating-point number. The output will then have the same type as the input, so you can directly call the original `Compute` function above; there is no need to launch the kernel here.
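
  For illustration only, a minimal sketch of the shape the reviewer seems to be suggesting for `ComputeMixedType` (not the actual patch; the non-float branch below is just a placeholder comment):

    template<typename xpu, typename OP>
    static void ComputeMixedType(const nnvm::NodeAttrs& attrs,
                                 const OpContext& ctx,
                                 const std::vector<TBlob>& inputs,
                                 const std::vector<OpReqType>& req,
                                 const std::vector<TBlob>& outputs) {
      if (mxnet::common::is_float(inputs[0].type_flag_)) {
        // Floating-point input: the output dtype matches the input dtype,
        // so the existing Compute member above already covers this case.
        Compute<xpu, OP>(attrs, ctx, inputs, req, outputs);
      } else {
        // Only a non-float input promoted to a floating-point output still
        // needs a dedicated mixed-type kernel launch here.
        // ... type switch and kernel launch converting the input dtype to the output dtype ...
      }
    }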
