benqua commented on issue #8297: [scala] Make accuracy independent of output size (fix #8226)
URL: https://github.com/apache/incubator-mxnet/pull/8297#issuecomment-337598650
 
 
   @javelinjs, you're right: it changes the definition of accuracy for output.size > 1.
   What is the exact definition of Accuracy? I couldn't find a clear definition.
   
   This change provides a definition of accuracy that matches the one from Wikipedia for binary classification, which says:
   _the accuracy is the proportion of true results (both true positives and true negatives) among the total number of cases examined_
   
(https://en.wikipedia.org/wiki/Accuracy_and_precision#In_binary_classification).
   
   It seems weird (at least to me :) ) that the accuracy depends on the output dimension and can grow far above 1. By dividing by the label dimension as well, we keep the accuracy between 0 and 1, which is the expected range of a "proportion".
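   A minimal sketch of the normalization being discussed (plain Scala with hypothetical names, not the actual `EvalMetric` code): counting element-wise matches and dividing by the total number of label elements keeps the result a proportion in [0, 1], regardless of the output size per sample.

```scala
object AccuracySketch {
  // Hypothetical helper: counts element-wise matches between predictions
  // and labels, then divides by the total number of label elements, so
  // the result stays in [0, 1] even when each output has size > 1.
  def accuracy(preds: Array[Array[Int]], labels: Array[Array[Int]]): Double = {
    require(preds.length == labels.length, "batch sizes must match")
    val correct = preds.zip(labels).map { case (p, l) =>
      p.zip(l).count { case (pi, li) => pi == li }
    }.sum
    val total = labels.map(_.length).sum  // normalize by label dimension too
    correct.toDouble / total
  }
}
```

   For example, with two samples of output size 2 and three correct elements out of four, this returns 0.75 rather than 1.5.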
   
   If we change sumMetric to Double, should we do it only for the value stored internally and keep Float in the EvalMetric API?
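   A sketch of that option (hypothetical names, assuming the existing API returns Float): accumulate in Double internally for precision, convert only at the public boundary.

```scala
// Hypothetical sketch: keep the running sum as Double internally while
// the public getter preserves the existing Float-returning signature.
class SumMetricSketch {
  private var sumMetric: Double = 0.0 // Double internally to limit rounding drift
  private var numInst: Int = 0

  def update(correct: Int, total: Int): Unit = {
    sumMetric += correct.toDouble / total // per-batch proportion in [0, 1]
    numInst += 1
  }

  // Public API unchanged: callers still see a Float.
  def get: Float = (sumMetric / numInst).toFloat
}
```

   This keeps binary compatibility for existing callers while only the internal accumulator changes.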

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
