On Thu, Mar 22, 2012 at 9:09 AM, David Warde-Farley
<[email protected]> wrote:

> <side rant> In particular, doing 1 vs rest for logistic regression seems like 
> an odd choice when there is a perfectly good multiclass generalization of 
> logistic regression. Mathieu clarified to me last night how liblinear is 
> calculating "probabilities" in the multiclass case, and it seems insane to 
> me, from a calibration perspective (normalizing a bunch of things by their 
> sum does not make them probabilities in any meaningful sense!).

Excellent point. The reason you can normalize the probabilities with
the softmax in multiclass logistic regression is that the softmax
appears in the *objective function* itself. Here, liblinear
normalizes the outputs even though nothing in the training ever
enforces that behavior: with one-vs-rest, each binary classifier is
by definition trained independently!
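
To make this concrete, here is a minimal NumPy sketch (the decision
values are made up for illustration, standing in for the outputs of
three independently trained one-vs-rest classifiers) contrasting the
post-hoc normalization with a genuine softmax:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())  # shift for numerical stability
    return e / e.sum()

# Hypothetical decision values for one sample from three
# independently trained one-vs-rest binary classifiers.
scores = np.array([2.0, 1.0, -0.5])

# Post-hoc normalization: squash each binary score through a
# sigmoid, then divide by the sum. Nothing in the one-vs-rest
# objective constrains these values, so dividing makes them sum
# to one but does not make them calibrated probabilities.
ovr = sigmoid(scores)
print(ovr / ovr.sum())   # ~[0.443 0.368 0.190]

# Multinomial logistic regression: the softmax is part of the
# objective, so the model is trained to output a proper
# conditional distribution over the classes.
print(softmax(scores))   # ~[0.690 0.254 0.057]

The two vectors differ substantially even on the same scores, which
is exactly the calibration problem David is describing.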

Mathieu
