sxjscience commented on issue #18022: [Numpy] Weird bug
URL: 
https://github.com/apache/incubator-mxnet/issues/18022#issuecomment-612180240
 
 
   It's a very weird issue. Currently I think one cause is that we haven't 
correctly handled mixed dtypes:
   
   We will internally call into `BackwardUseNone`:
   
https://github.com/apache/incubator-mxnet/blob/1679adea81dd00042d3f26b9f8b1d5fa96186bbd/src/operator/tensor/elemwise_binary_broadcast_op-inl.cuh#L37-L48
   
   However, this operator uses the dtype of `outputs[0]` as the dtype of all 
the tblobs, which is wrong when the inputs have different dtypes:
   
https://github.com/apache/incubator-mxnet/blob/1679adea81dd00042d3f26b9f8b1d5fa96186bbd/src/operator/tensor/elemwise_binary_op.h#L697-L699
   An error is then raised in `BackwardUseNone_` because some tblobs have a 
dtype different from `DType`:
   
https://github.com/apache/incubator-mxnet/blob/1679adea81dd00042d3f26b9f8b1d5fa96186bbd/src/operator/tensor/elemwise_binary_op.h#L109-L136
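   To make the failure mode concrete, here is a minimal standalone sketch of the pattern (not MXNet's actual code; `TBlob`, the dtype flags, and the function names below are simplified stand-ins). The dtype switch is driven by `outputs[0]` alone, and every tblob is then reinterpreted under that single `DType`, so any input with a different dtype trips the check:

```cpp
#include <cassert>
#include <cstdint>
#include <stdexcept>
#include <vector>

// Simplified stand-ins for mshadow dtype flags and TBlob.
enum DTypeFlag { kFloat32, kInt64 };
struct TBlob { DTypeFlag type_flag_; };

// Mimics BackwardUseNone_: every tblob is accessed as DType. The check
// here stands in for tblob.dptr<DType>(), which raises on a dtype
// mismatch between the stored dtype and the dispatched DType.
template <typename DType>
void backward_use_none_(const std::vector<TBlob>& inputs,
                        const std::vector<TBlob>& outputs,
                        DTypeFlag dispatched) {
  for (const TBlob& b : inputs)
    if (b.type_flag_ != dispatched)
      throw std::runtime_error("dptr<DType>: dtype mismatch");
  for (const TBlob& b : outputs)
    if (b.type_flag_ != dispatched)
      throw std::runtime_error("dptr<DType>: dtype mismatch");
}

// Mimics the dispatch in elemwise_binary_op.h: the switch looks only at
// outputs[0].type_flag_, ignoring the dtypes of the other tblobs.
void backward_use_none(const std::vector<TBlob>& inputs,
                       const std::vector<TBlob>& outputs) {
  switch (outputs[0].type_flag_) {
    case kFloat32: backward_use_none_<float>(inputs, outputs, kFloat32); break;
    case kInt64:   backward_use_none_<int64_t>(inputs, outputs, kInt64); break;
  }
}
```

   With all tblobs sharing `outputs[0]`'s dtype this runs fine; mixing a `float32` output with an `int64` input reproduces the error path described above.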
   
   **Important Notice!!**
   The weird thing is that `req` should be `kNull` for the value with the 
`long long` dtype, which is `p_mask` in the example above. Thus, the gradient 
of a `long long` value should never be calculated at all.
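   In other words, the expected guard looks something like the sketch below (again a simplified stand-in, not MXNet's code): outputs whose `req` is `kNullOp` should be filtered out before the single-`DType` dispatch ever inspects their dtype.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// OpReqType as in MXNet: kNullOp means "do not compute this gradient".
enum OpReqType { kNullOp, kWriteTo, kAddTo };

// Hypothetical guard: collect only the gradient slots that actually
// need computing. A slot with req == kNullOp (e.g. the gradient of an
// integer mask such as p_mask) is skipped, so its dtype never reaches
// the dtype switch.
std::vector<std::size_t> grads_to_compute(const std::vector<OpReqType>& req) {
  std::vector<std::size_t> idx;
  for (std::size_t i = 0; i < req.size(); ++i)
    if (req[i] != kNullOp) idx.push_back(i);
  return idx;
}
```

   The open question is why the backward pass still reaches the dtype check for `p_mask` even though its `req` should already be `kNull`.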

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services