szha commented on issue #8027: Optional reshape of predictions in Perplexity
metric
URL: https://github.com/apache/incubator-mxnet/pull/8027#issuecomment-359669294
Yes, I was looking into the metrics, and I didn't feel that a separate
perplexity metric was necessary in Gluon. The reason is that perplexity is
usually just exp(cross_entropy_loss). Since cross_entropy_loss is required for
training and often used in testing, computing perplexity is just a matter of
taking its exponential. Having a separate perplexity metric in that case could
mean unnecessarily repeated computation.
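To illustrate the relationship, here is a minimal sketch (using hypothetical per-token probabilities, not the actual Gluon API) of how perplexity falls out of the mean cross-entropy for free:

```python
import math

# Hypothetical predicted probabilities assigned to the true label
# at each of three tokens (illustrative values only).
true_label_probs = [0.5, 0.25, 0.125]

# Mean cross-entropy (negative log-likelihood) over the tokens --
# this is what a cross-entropy loss already computes during training.
cross_entropy = -sum(math.log(p) for p in true_label_probs) / len(true_label_probs)

# Perplexity is just the exponential of that mean cross-entropy,
# so no separate metric computation is needed.
perplexity = math.exp(cross_entropy)
print(perplexity)  # → 4.0 (the geometric mean of 2, 4, and 8)
```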
Our separation of metric and loss was an artifact of past design choices.
Given that gluon.loss already covers many metrics, I'm not sure how much value
a separate metric package would add. The only case where it would be useful is
for metrics with non-differentiable calculations, such as those involving
argmax and topk (accuracy, precision, recall, F1, etc.).
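As a small sketch of why those metrics are different (hypothetical scores and labels, plain Python rather than the Gluon API): an argmax-based metric like accuracy has no useful gradient, so it cannot double as a training loss the way cross-entropy can.

```python
# Hypothetical per-example class scores and true labels.
preds = [[0.1, 0.9], [0.8, 0.2], [0.3, 0.7]]
labels = [1, 0, 0]

# The argmax step is non-differentiable, which is why accuracy
# needs to live in a metric, not a loss.
argmaxes = [row.index(max(row)) for row in preds]
accuracy = sum(a == l for a, l in zip(argmaxes, labels)) / len(labels)
print(accuracy)  # → 0.666... (2 of 3 predictions correct)
```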
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
With regards,
Apache Git Services