Hi Gilles -
thanks for the reply. I think changing the relative class weights does more or
less what we want, which is to optimize classification at very low false-alarm
probability.
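The mechanism is easy to see in isolation: class weights scale the class counts
inside the impurity calculation, so a rare positive class can dominate a node's
score. A minimal pure-Python sketch of that idea (weighted Gini only; this is
an illustration, not scikit-learn's actual tree internals):

```python
def weighted_gini(counts, class_weights):
    """Gini impurity with each class count scaled by its class weight."""
    wc = [c * w for c, w in zip(counts, class_weights)]
    total = sum(wc)
    return 1.0 - sum((x / total) ** 2 for x in wc)

# A node with 95 negatives and 5 positives:
counts = [95, 5]
print(weighted_gini(counts, [1, 1]))   # ~0.095: node looks nearly pure
print(weighted_gini(counts, [1, 20]))  # ~0.500: up-weighting the positives
                                       # makes the node look very impure, so
                                       # splits that isolate them pay off
```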
Another question on the DecisionTreeClassifier: what does the argument
splitter='best'
actually do?
Hi,
the only current options for deciding on feature splits in trees / forests are
'entropy' and 'gini'. Two questions on this:
- is anyone planning on implementing others?
- how feasible would it be to have the option of passing a custom function to
the tree or forest to use in splitting?
Hi Thomas,
Indeed, gini and entropy are the only supported impurity criteria for
classification. I don't think we have plans right now to add others - which
one do you have in mind?
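For reference, the two criteria are just different impurity measures over the
class-probability vector at a node; a quick sketch of both (hypothetical helper
names, not scikit-learn code):

```python
from math import log2

def gini(p):
    """Gini impurity of a class-probability vector."""
    return 1.0 - sum(q ** 2 for q in p)

def entropy(p):
    """Shannon entropy (in bits) of a class-probability vector."""
    return -sum(q * log2(q) for q in p if q > 0)

print(gini([0.5, 0.5]), entropy([0.5, 0.5]))  # both maximal for a 50/50 node:
                                              # 0.5 and 1.0
print(gini([1.0, 0.0]), entropy([1.0, 0.0]))  # both zero for a pure node
```

Gini is slightly cheaper to evaluate (no logarithm), and in practice the two
usually select very similar splits.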
> how feasible would it be to have the option of passing custom function to
> the tree or forest to use in splitting?
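To make the feasibility question concrete: in pure Python, a split search
parameterized by an arbitrary impurity callable is easy to write (sketch below;
this is an illustration, not scikit-learn code). The catch is that
scikit-learn's tree-building loop is compiled Cython, so calling back into a
Python function at every candidate split would be very slow.

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    counts = {}
    for c in labels:
        counts[c] = counts.get(c, 0) + 1
    return 1.0 - sum((k / n) ** 2 for k in counts.values())

def best_split(xs, ys, impurity):
    """Exhaustive 1-D split search minimizing weighted child impurity.

    `impurity` may be any callable taking a list of labels -- the
    'custom function' idea from the question above.
    """
    best_t, best_score = None, float("inf")
    for t in sorted(set(xs))[:-1]:
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * impurity(left)
                 + len(right) * impurity(right)) / len(ys)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

# Perfectly separable toy data: the best threshold is 2.0, impurity 0.
print(best_split([1.0, 2.0, 3.0, 4.0], [0, 0, 1, 1], gini))  # (2.0, 0.0)
```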