Re: [Scikit-learn-general] classification metrics understanding

2015-11-28 Thread Joel Nothman
If you are treating your Logistic Regression output as binary (i.e. not
using predict_proba or decision_function), could you please provide the
confusion matrix?
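For reference, a minimal sketch of how that confusion matrix (and a score-based ROC
curve) could be produced; `model`, `X_test` and `expected` are just placeholders here
for your fitted estimator and held-out data:

from sklearn import metrics

# hard-label predictions -> confusion matrix (rows: true class, columns: predicted class)
predicted = model.predict(X_test)
print(metrics.confusion_matrix(expected, predicted, labels=[1.0, 2.0]))

# for a meaningful ROC curve, pass continuous scores rather than the predicted labels
scores = model.predict_proba(X_test)[:, 0]  # first column of classes_ = probability of class 1.0
fpr, tpr, thresholds = metrics.roc_curve(expected, scores, pos_label=1)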

On 26 November 2015 at 05:06, Herbert Schulz  wrote:

> Hi, I think I have some misunderstanding regarding the classification metrics
> in scikit-learn.
>
>
>
> I have a 2-class problem; the classes are 1.0 and 2.0.
>
>
>              precision    recall  f1-score   support
>
>         1.0       0.86      0.76      0.81       254
>         2.0       0.49      0.65      0.56        91
>
> avg / total       0.76      0.73      0.74       345
>
>
> Specificity: [ 1.          0.35164835  0.        ]
> recall, tpr, sensitivity: [ 0.          0.24015748  1.        ]
>
>
> # this part is manually computed (precision, sensitivity, specificity, balanced accuracy)
>
> logistic regression: 0.86, 0.76, 0.65, 0.70
>
>
>
> The part with
>
> Specificity: [ 1.  0.35164835  0.]
> recall, tpr, sensitivity: [ 0.  0.24015748  1.]
>
> is computed with
>
> fpr, tpr, thresholds = metrics.roc_curve(expected, predi, pos_label=1)
> print "Specificity:", 1 - fpr
> print "recall, tpr, sensitivity:", tpr
>
> Why is the specificity (1 - fpr) computed as [ 1.  0.35164835  0.]
> and not 0.65?
>
> The same happens with recall.


[Scikit-learn-general] classification metrics understanding

2015-11-25 Thread Herbert Schulz
Hi, I think I have some misunderstanding regarding the classification metrics
in scikit-learn.



I have a 2-class problem; the classes are 1.0 and 2.0.


             precision    recall  f1-score   support

        1.0       0.86      0.76      0.81       254
        2.0       0.49      0.65      0.56        91

avg / total       0.76      0.73      0.74       345
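(The table above looks like the output of metrics.classification_report; as a one-line
sketch, assuming the same expected and predi arrays as in the roc_curve snippet further
down:

from sklearn import metrics
print(metrics.classification_report(expected, predi))
)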


Specificity: [ 1.          0.35164835  0.        ]
recall, tpr, sensitivity: [ 0.          0.24015748  1.        ]


# this part is manually computed (precision, sensitivity, specificity, balanced accuracy)

logistic regression: 0.86, 0.76, 0.65, 0.70
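(A minimal sketch of that manual computation, assuming expected and predi are the same
arrays used in the roc_curve snippet below and class 1.0 is treated as positive:

from sklearn import metrics

cm = metrics.confusion_matrix(expected, predi, labels=[1.0, 2.0])
tp, fn = cm[0, 0], cm[0, 1]  # row 0: true class 1.0
fp, tn = cm[1, 0], cm[1, 1]  # row 1: true class 2.0

sensitivity = tp / float(tp + fn)  # recall of class 1.0 -> 0.76 above
specificity = tn / float(tn + fp)  # recall of class 2.0 -> 0.65 above
balanced_accuracy = (sensitivity + specificity) / 2.0  # -> 0.70 above
)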



The part with

Specificity: [ 1.  0.35164835  0.]
recall, tpr, sensitivity: [ 0.  0.24015748  1.]

is computed with

fpr, tpr, thresholds = metrics.roc_curve(expected, predi, pos_label=1)
print "Specificity:", 1 - fpr
print "recall, tpr, sensitivity:", tpr

Why is the specificity (1 - fpr) computed as [ 1.  0.35164835  0.]
and not 0.65?

The same happens with recall.
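For what it's worth, a sketch of one possible explanation, assuming predi really holds
the hard labels 1.0/2.0: roc_curve treats its second argument as a score, so the higher
label 2.0 ranks as the "more positive" prediction even with pos_label=1.

# with only two distinct "scores" (the labels 1.0 and 2.0) the curve has exactly three
# points: call nothing positive, call "score" >= 2.0 positive, call everything positive
fpr, tpr, thresholds = metrics.roc_curve(expected, predi, pos_label=1)

# at the middle threshold a sample counts as a positive call when its "score" is 2.0,
# i.e. when it was predicted as class 2, so:
#   tpr[1] = fraction of true 1.0 samples predicted as 2.0  (1 - 0.76 = 0.24)
#   fpr[1] = fraction of true 2.0 samples predicted as 2.0  (0.65)
# and 1 - fpr[1] = 0.35 is what the "Specificity" print shows, not 0.65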