On Wed, Nov 16, 2011 at 11:38, Andreas Müller <[email protected]> wrote:
>
>>> - a class for regression and one for classification
>>> - MSE and cross entropy (for classification only) loss functions
>> We need several loss functions and their gradients in Cython (we cannot
>> reuse the loss functions from the SGD module since the output of an
>> MLP can be multi-variate). For classification we will need the hinge loss
>> and the squared hinge loss (and Huber for regression). See the source of
>> libsgd for a list of useful loss functions.
>>
>>
> Can you explain how hinge-loss works for multiple classes?
> Or would you train a separate mlp for each class?

Usually the multiclass hinge loss minimizes
max(0, 1 + max_{c != correct_class} score(c, x) - score(correct_class, x)).
That is, the correct class must score at least 1 higher than any other
class (or, equivalently, than the highest-scoring of the other classes).
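
For concreteness, here is a minimal NumPy sketch of that loss for a
single sample; the function name and the toy scores are just for
illustration, not part of any existing module:

    import numpy as np

    def multiclass_hinge_loss(scores, y):
        """Crammer-Singer style multiclass hinge loss for one sample.

        scores : 1-d array of per-class scores (e.g. the MLP outputs)
        y      : index of the correct class
        """
        # highest score among all classes other than the correct one
        best_other = np.delete(scores, y).max()
        return max(0.0, 1.0 + best_other - scores[y])

    # correct class 2 scores 2.0, best other class 1.5:
    # margin is only 0.5 < 1, so the loss is 0.5
    print(multiclass_hinge_loss(np.array([0.3, 1.5, 2.0]), 2))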

-- 
 - Alexandre
