> > Will this let us run SGDClassifier and show us per-class probability
> > outputs?
> > Again, that's the only reason we've been using OneVsRestClassifier. Let me
> > explain what I mean by per-class probability, just in case it isn't clear:
> >
> > SGDClassifier's predict_proba() returns probability of belonging to each
> > class, so if for example there are five classes, it will return something like
> > [0.5, 0.3, 0.1, 0.05, 0.05].
> > For our use case, though, we need a negative/positive probability display,
> > i.e. [(0.4, 0.5), (0.7, 0.3), (0.8, 0.2), (0.9, 0.1), (0.6, 0.4)] for five
> > classes, showing the probability that the input does not belong/does belong
> > to that class, respectively.
> >
> Yes, if you don't normalize.
> You are aware that this is inconsistent when you are doing multi-class, 
> not multi-label, right?
> If there is only one correct label, it cannot be label 2 with
> probability .7 and label 3 with probability .8.
> 

Those are "does not belong"/"does belong" pairs; the first number is the
probability that the input is NOT part of the class. :)

I'm starting to understand what you mean; the "[(0.4, 0.5), (0.7, 0.3), ..."
values are achieved by taking the sigmoid of each value in the decision
function, right? And if I then normalize that, I'll get something in the form
of "[0.5, 0.3, 0.1, 0.05, 0.05]"? Apologies, I'm still new to some of this stuff!
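
In code, here is a rough sketch of my understanding (toy data; applying a plain
sigmoid to the decision function is just my assumption of what you mean, not
what predict_proba actually does internally):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

# Toy 5-class problem, purely for illustration.
X, y = make_classification(n_samples=200, n_features=10,
                           n_informative=6, n_classes=5, random_state=0)

clf = SGDClassifier(loss="hinge", random_state=0).fit(X, y)

# One decision value per class (SGDClassifier handles multi-class one-vs-rest).
scores = clf.decision_function(X[:1])           # shape (1, 5)

# Sigmoid of each score -> an independent "belongs to class k" probability.
p = 1.0 / (1.0 + np.exp(-scores))

# (does-not-belong, does-belong) pairs, as in the display above.
pairs = np.stack([1.0 - p, p], axis=-1)         # shape (1, 5, 2)

# Normalizing the "belongs" column gives one distribution over all classes,
# i.e. the [0.5, 0.3, 0.1, 0.05, 0.05]-style output.
probs = p / p.sum(axis=1, keepdims=True)        # each row sums to 1
```

The pairs sum to 1 within each class, while the normalized version sums to 1
across classes; is that the distinction you were pointing at?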

Thanks, Afik



_______________________________________________
Scikit-learn-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
