Neutron3529 commented on pull request #18423: URL: https://github.com/apache/incubator-mxnet/pull/18423#issuecomment-643600093
> Also, could you add a test case for the expected consistency?

I am sorry, but I don't know where to add the test case. As documented for [SoftmaxCrossEntropyLoss](https://mxnet.apache.org/api/python/docs/api/gluon/loss/index.html#mxnet.gluon.loss.SoftmaxCrossEntropyLoss) and [KLDivLoss](https://mxnet.apache.org/api/python/docs/api/gluon/loss/index.html#mxnet.gluon.loss.KLDivLoss), the return values of `SoftmaxCrossEntropyLoss()(pred, label)` and `KLDivLoss(from_logits=False)(pred, label.one_hot(number_of_classes))` should be the same, i.e., the following assertion must hold:

```python
import mxnet as mx
from mxnet.gluon.loss import SoftmaxCrossEntropyLoss, KLDivLoss

label = mx.nd.array([0, 1, 2, 3, 4, 0, 1, 2, 3, 4])
pred = mx.nd.random.normal(shape=(10, 5))
number_of_classes = pred.shape[1]

# Compare via 1 + x - 1 == 0 so that differences at the level of
# float rounding error are absorbed and the check still passes.
assert (1
        + (SoftmaxCrossEntropyLoss()(pred, label)
           - KLDivLoss(from_logits=False)(pred, label.one_hot(number_of_classes))
          ).abs().mean()
        - 1 == 0)
```
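The mathematical identity behind this expectation can be checked without MXNet at all: for a one-hot target `p`, `KL(p || softmax(pred))` reduces to `-log softmax(pred)[y]`, which is exactly the softmax cross-entropy. A minimal NumPy sketch of that reduction (variable names are mine, not from the PR):

```python
import numpy as np

rng = np.random.default_rng(0)
pred = rng.normal(size=(10, 5))            # raw logits, batch of 10, 5 classes
labels = np.array([0, 1, 2, 3, 4, 0, 1, 2, 3, 4])

# Numerically stable log-softmax along the class axis.
log_q = pred - pred.max(axis=1, keepdims=True)
log_q = log_q - np.log(np.exp(log_q).sum(axis=1, keepdims=True))

# Softmax cross-entropy per sample: -log q_y.
ce = -log_q[np.arange(len(labels)), labels]

# KL(one_hot || softmax(pred)) = sum_i p_i * (log p_i - log q_i).
# With one-hot p, log p_y = log 1 = 0 drops out, leaving -log q_y.
onehot = np.eye(5)[labels]
kl = -(onehot * log_q).sum(axis=1)

assert np.allclose(ce, kl)
```

This only demonstrates the per-sample identity; whether the Gluon losses agree in practice also depends on how each class reduces over the class and batch axes, which is what the PR's test case would need to pin down.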
