Hi Herbert,

The worst value for AUC is actually 0.5. A value close to 0 means that you
can get a value just as close to 1 simply by flipping your predictions
(predict class 1 when you think it's 0, and vice versa). Are you sure you
didn't confuse the classes somewhere along the way? (You might have chosen
the wrong column from predict_proba's result, for example.)
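For example, a minimal sketch (clf, X_test and y_test are placeholder
names here, assuming a fitted binary classifier):

    from sklearn import metrics

    # predict_proba returns one column per class, ordered as in clf.classes_.
    # For a 0/1 target, column 1 holds the probability of class 1.
    proba = clf.predict_proba(X_test)
    scores = proba[:, 1]  # probabilities for the positive class

    roc = metrics.roc_auc_score(y_test, scores)

    # Passing proba[:, 0] by mistake reverses the ranking and gives
    # 1 - AUC instead: an AUC of 0.097 would flip to roughly 0.903.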

On Tue, Aug 4, 2015 at 4:51 PM, Herbert Schulz <hrbrt....@gmail.com> wrote:

> Hey,
>
> I'm computing the AUC for some data...
>
>
> The classification target is 1 or 0, and I have a lot of 0's (5600) and
> just 700 1's as a target.
>
> My AUC is about 0.097...
>
> where y_test is a vector containing 1's and 0's and auc contains the
> predict_proba values:
>
>     roc = metrics.roc_auc_score(y_test, auc)
>
>
> Actually this value seems way too bad, because my balanced accuracy is
> about 0.77... I thought I might be doing something wrong.
>
>
> report:
>
>              precision    recall  f1-score   support
>
>         0.0       0.95      0.91      0.93       537
>         1.0       0.49      0.63      0.55        73
>
> avg / total       0.89      0.88      0.88       610
------------------------------------------------------------------------------
_______________________________________________
Scikit-learn-general mailing list
Scikit-learn-general@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
