Hi Samir,

The documentation links to the definition of the coefficient of determination:
https://en.m.wikipedia.org/wiki/Coefficient_of_determination

From that definition it is easy to see when the value becomes negative: whenever 
the model performs worse than the baseline of predicting the mean of the target 
for every observation.
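
Spelling out the definition from that page (just restating it, nothing new):

    R^2 = 1 - \frac{\sum_i (y_i - \hat{y}_i)^2}{\sum_i (y_i - \bar{y})^2}

where \hat{y}_i is the model's prediction and \bar{y} is the mean of the observed 
values. The score is negative exactly when the numerator (the model's squared 
residuals) exceeds the denominator (the squared deviations from the mean).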

A common misconception is that the 'squaredness' refers to squaring some single 
value. In fact, per the CoD's definition above, R^2 is one minus the ratio of the 
model's squared residuals to the squared deviations from the baseline, and nothing 
forces that expression to stay non-negative.
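
If it helps, here is a minimal sketch (the numbers are made up purely for 
illustration) showing r2_score returning zero for the mean baseline and going 
negative for predictions that are worse than it:

import numpy as np
from sklearn.metrics import r2_score

y_true = np.array([1.0, 2.0, 3.0, 4.0])

# Baseline: predict the mean of y_true everywhere -> SS_res == SS_tot -> R^2 == 0
baseline = np.full_like(y_true, y_true.mean())
print(r2_score(y_true, baseline))   # 0.0

# Predictions worse than the baseline -> SS_res > SS_tot -> R^2 < 0
worse = np.array([4.0, 3.0, 2.0, 1.0])
print(r2_score(y_true, worse))      # -3.0

Cross-validation can easily produce this situation: the model is fit on one fold 
and scored on a held-out fold, where it may do worse than simply predicting the 
training mean.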

Hope this helps,
-Tom

Sent on the go
________________________________
From: scikit-learn <scikit-learn-bounces+drabas.t=gmail....@python.org> on 
behalf of Samir K Mahajan <samirkmahajan1...@gmail.com>
Sent: Wednesday, August 11, 2021 12:16:34 PM
To: scikit-learn@python.org <scikit-learn@python.org>
Subject: [scikit-learn] Regarding negative value of sklearn.metrics.r2_score 
and sklearn.metrics.explained_variance_score

Dear All,
I am amazed to find negative values of sklearn.metrics.r2_score and 
sklearn.metrics.explained_variance_score in a model (cross-validation of an OLS 
regression model).
However, what amuses me more is seeing you justify a negative 
'sklearn.metrics.r2_score' in your documentation. This does not make sense to 
me. Please justify to me how squared values can be negative.

Regards,
Samir K Mahajan.

_______________________________________________
scikit-learn mailing list
scikit-learn@python.org
https://mail.python.org/mailman/listinfo/scikit-learn