No, it is the macro average of the per-class F1 scores, i.e. an arithmetic
mean over the harmonic means of precision and recall computed per class. So
the 0.488888... that sklearn reports is the correct value.
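
For instance, a quick check with the arrays from your message (a minimal
sketch; the commented values are what this example produces):

from sklearn.metrics import f1_score
import numpy as np

y_true = [0, 1, 2, 2, 2]
y_pred = [0, 0, 2, 2, 1]

# Per-class F1: the harmonic mean of precision and recall within each class.
per_class = f1_score(y_true, y_pred, average=None)
print(per_class)  # [0.66666667 0.         0.8       ]

# The macro average is the arithmetic mean of those per-class scores,
# which matches what average='macro' reports directly.
print(np.mean(per_class))                         # 0.488888...
print(f1_score(y_true, y_pred, average='macro'))  # 0.488888...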

On Fri., 29 Mar. 2019, 9:53 am Max Halford, <maxhalfor...@gmail.com> wrote:

> Hey everyone,
>
> I've stumbled upon an inconsistency with the F1 score and I can't seem to
> get around it. I have two lists y_true = [0, 1, 2, 2, 2] and y_pred = [0,
> 0, 2, 2, 1]. sklearn tells me that the macro-averaged F1 score is
> 0.488888... If I understand correctly the macro-average F1 score is the
> harmonic mean of the macro-average precision score and the macro-average
> recall score. sklearn tells me that the macro-average precision is 0.5
> whilst the macro-average recall is 0.555555... If I use the
> statistics.harmonic_mean function from Python's standard library, this
> gives me around 0.526315.
>
> So which is correct: 0.488888 or 0.526315? I apologize in advance if I've
> overlooked something silly.
>
> Best regards.
>
> --
> Max Halford
> +336 28 25 13 38
_______________________________________________
scikit-learn mailing list
scikit-learn@python.org
https://mail.python.org/mailman/listinfo/scikit-learn
