Hi, all,

I was wondering if it is somehow possible to define a loss function for the Naive 
Bayes classifier in scikit-learn. For example, let's assume that we are 
interested in spam vs. ham classification. In this context, such a loss 
function would be useful for lowering the False Positive rate (i.e., classifying 
ham as spam, which is "worse" than classifying spam as ham).

For simplicity, I have an example using random data drawn from Gaussian 
distributions:
(http://nbviewer.ipython.org/github/rasbt/pattern_classification/blob/master/stat_pattern_class/supervised/parametric/5_stat_superv_parametric.ipynb)
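To make it concrete, below is a rough sketch of the kind of workaround I have 
in mind: not a true loss function, but shifting the decision threshold on 
predict_proba so that "spam" is only predicted when the posterior is high. 
GaussianNB, the synthetic data, and the 0.9 threshold are just placeholders 
for illustration.

    import numpy as np
    from sklearn.naive_bayes import GaussianNB

    rng = np.random.RandomState(0)

    # Two Gaussian blobs: class 0 = "ham", class 1 = "spam"
    X = np.vstack([rng.normal(0, 1, size=(100, 2)),
                   rng.normal(2, 1, size=(100, 2))])
    y = np.array([0] * 100 + [1] * 100)

    clf = GaussianNB().fit(X, y)

    # Instead of the default argmax rule, require P(spam | x) > 0.9
    # before predicting spam; everything else is predicted as ham.
    # This lowers the False Positive rate at the cost of more missed spam.
    posterior_spam = clf.predict_proba(X)[:, 1]
    y_pred = (posterior_spam > 0.9).astype(int)

Is there a more principled way to do this in scikit-learn, e.g., by passing 
per-class misclassification costs to the classifier directly?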

Best,
Sebastian