Could be a bug introduced by a recent modification of the Confusion matrix. Are you using trunk? Can you provide a patch that fixes the issue?

Best,
Sebastian


On 05/25/2014 05:05 PM, Michael Christopher wrote:
Hi

With the confusion matrix below I get a Kappa value of 0.8023, but it
should be 1, because there are no misclassifications.

I debugged and found out that this happens because the samples variable
is incremented twice in the "putCount" method
(org.apache.mahout.classifier.ConfusionMatrix). Commenting out one of
the increments led to a Kappa value of 1.

Is this a bug, or am I doing something wrong?


=======================================================
Confusion Matrix
-------------------------------------------------------
a        b        <--Classified as
6        0         |  6         a     = 1_YES
0        9         |  9         b     = 2_NO

=======================================================
Statistics
-------------------------------------------------------
Kappa                                       0,8023
Accuracy                                       100%
Reliability                                66,6667%
Reliability (standard deviation)            0,5774
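
For reference, here is a minimal sketch of the standard Cohen's kappa
formula applied to the matrix above (the class and method names here are
hypothetical, not Mahout's actual code). It confirms that a purely
diagonal confusion matrix should yield kappa = 1, and it shows why
double-counting the sample total would drag the value below 1 even with
perfect agreement:

```java
public class KappaDemo {

    // Cohen's kappa: (po - pe) / (1 - pe), where po is observed
    // agreement and pe is chance agreement from the row/column marginals.
    static double kappa(long[][] m) {
        int n = m.length;
        long total = 0, agreed = 0;
        long[] rowSum = new long[n], colSum = new long[n];
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                total += m[i][j];
                rowSum[i] += m[i][j];
                colSum[j] += m[i][j];
                if (i == j) agreed += m[i][j];
            }
        }
        double po = (double) agreed / total;
        double pe = 0.0;
        for (int i = 0; i < n; i++) {
            pe += ((double) rowSum[i] / total) * ((double) colSum[i] / total);
        }
        return (po - pe) / (1.0 - pe);
    }

    public static void main(String[] args) {
        // The matrix from the email: 6 + 9 correct, 0 wrong.
        long[][] m = {{6, 0}, {0, 9}};
        System.out.println(kappa(m)); // prints 1.0

        // If the sample total were counted twice (30 instead of 15),
        // po would drop to 0.5 and kappa would fall below 1 despite
        // perfect agreement -- consistent with the bug described above.
    }
}
```
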

