John1231983 opened a new issue #18333:
URL: https://github.com/apache/incubator-mxnet/issues/18333


   Given a softmax-normalized feature map `f` of shape `Bx2xHxW` and a target label of shape `Bx1xHxW`, I want to implement cross-entropy loss using symbols only. This is my implementation:
   ```python
   # target has shape Bx1xHxW
   target_squeeze = mx.symbol.squeeze(target, axis=1)  # BxHxW
   # on_value=-1.0 folds the minus sign of cross entropy into the one-hot target
   target_one_hot = mx.symbol.one_hot(target_squeeze, depth=2,
                                      on_value=-1.0, off_value=0.0)  # BxHxWx2
   # Transpose from BxHxWx2 to Bx2xHxW to match f
   target_one_hot = mx.symbol.transpose(target_one_hot, axes=(0, 3, 1, 2))
   # Element-wise log of the softmax probabilities f
   f_log = mx.symbol.log(f)
   batch_size = 32
   # Sum over classes and pixels, average over the batch
   loss = mx.symbol.sum(target_one_hot * f_log) / batch_size
   loss = mx.symbol.MakeLoss(loss, name='loss_ce')
   ```
   Is my implementation correct? If not, please correct it for me. Thanks in advance.
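   One way to sanity-check a symbol graph like this is to mirror it in NumPy on small random inputs and compare against a direct cross-entropy computation. The sketch below (names and shapes are illustrative, not from the issue) reproduces the one-hot/transpose/log/sum steps and checks them against picking out the probability of the true class:
   ```python
   import numpy as np

   # Small hypothetical shapes for a quick numerical check
   B, C, H, W = 2, 2, 3, 3
   rng = np.random.default_rng(0)

   # f: softmax probabilities over C classes, shape BxCxHxW
   logits = rng.normal(size=(B, C, H, W))
   f = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

   # target: integer class labels, shape Bx1xHxW
   target = rng.integers(0, C, size=(B, 1, H, W))

   # Mirror of the symbol graph: one_hot with on_value=-1.0 folds the
   # minus sign of cross entropy into the target tensor
   t = np.squeeze(target, axis=1)                  # BxHxW
   one_hot = np.eye(C)[t] * -1.0                   # BxHxWxC
   one_hot = one_hot.transpose(0, 3, 1, 2)         # BxCxHxW
   loss_symbol_style = (one_hot * np.log(f)).sum() / B

   # Reference: standard cross entropy, summed over pixels, averaged over batch
   picked = np.take_along_axis(f, target, axis=1)  # probability of the true class
   loss_reference = -np.log(picked).sum() / B

   print(np.allclose(loss_symbol_style, loss_reference))  # True if the graph matches
   ```
   Note this divides by the batch size only, as in the snippet above; if you want a per-pixel mean instead, divide by `B * H * W`.
   
   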


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
