Thanks, Gael,
that's very useful information.

I will do some hyperparameter tuning via GridSearch on the alpha and prior
parameters then, using roc_auc as the scoring metric, and see how it goes.
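Roughly what I have in mind (a minimal sketch with made-up count data; the
particular alpha grid and the choice of MultinomialNB are just assumptions
for illustration):

```python
import numpy as np
from sklearn.naive_bayes import MultinomialNB
from sklearn.model_selection import GridSearchCV

# Toy stand-in data: 200 documents, 10 count features, binary labels.
rng = np.random.RandomState(0)
X = rng.randint(0, 5, size=(200, 10))
y = rng.randint(0, 2, size=200)

# Grid over the smoothing parameter and whether to learn class priors.
param_grid = {
    "alpha": [0.01, 0.1, 1.0, 10.0],
    "fit_prior": [True, False],
}

# roc_auc uses predict_proba under the hood, so it works with NB directly.
search = GridSearchCV(MultinomialNB(), param_grid, scoring="roc_auc", cv=5)
search.fit(X, y)
print(search.best_params_)
```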

Best,
Sebastian 

On Aug 31, 2014, at 5:10 PM, Gael Varoquaux <[email protected]> 
wrote:

> On Sat, Aug 30, 2014 at 03:53:24PM -0400, Sebastian Raschka wrote:
>> I was wondering if it is somehow possible to define a loss function for 
>> the Naive Bayes classifier in scikit-learn.
> 
> No.
> 
>> For example, let's assume that we are interested in spam vs. ham
>> classification. In this context, such a loss function would be useful
>> to lower the False Positive rate (i.e., classifying ham as spam, which
>> is "worse" than classifying spam as ham).
> 
> You can shift the ratio of errors in one class vs the other by accessing
> directly the output of 'predict_proba' and thresholding at a different
> value than equal probability for each class.
> 
> HTH,
> 
> Gael
> 
> ------------------------------------------------------------------------------
> Slashdot TV.  
> Video for Nerds.  Stuff that matters.
> http://tv.slashdot.org/
> _______________________________________________
> Scikit-learn-general mailing list
> [email protected]
> https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
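
For the archives, the thresholding Gael describes can be sketched like this
(a minimal example on made-up data; treating class 1 as "spam" and the 0.8
cutoff are arbitrary assumptions):

```python
import numpy as np
from sklearn.naive_bayes import MultinomialNB

# Toy stand-in data: 100 documents, 10 count features, binary labels.
rng = np.random.RandomState(0)
X = rng.randint(0, 5, size=(100, 10))
y = rng.randint(0, 2, size=100)  # 1 = "spam" (hypothetical labeling)

clf = MultinomialNB().fit(X, y)

# Column order of predict_proba follows clf.classes_, so column 1 is P(spam).
proba_spam = clf.predict_proba(X)[:, 1]

# predict() is equivalent to thresholding at equal probability (0.5 for two
# classes); demanding a higher probability before flagging spam lowers the
# false-positive (ham-as-spam) rate at the cost of missing some spam.
y_pred_default = (proba_spam > 0.5).astype(int)
y_pred_strict = (proba_spam > 0.8).astype(int)
print(y_pred_strict.sum(), y_pred_default.sum())
```

Every sample flagged at the stricter threshold is also flagged at the default
one, so the strict predictor can only flag fewer messages as spam.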


