Hi,
On 17 November 2013 22:45, Thomas Dent wrote:
> Hi Gilles -
>
> thanks for the reply. I think changing the relative class weights does
> more or less what we want, which is to optimize the classification at very
> low false alarm probability.
>
> Another question on the DecisionTreeClassifier: does the argument
> splitter='best' actually do anything?
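As a rough illustration of the class-weight approach, here is a minimal
sketch; it assumes a scikit-learn version in which DecisionTreeClassifier
accepts a class_weight argument (with older releases the same effect can be
obtained by passing sample_weight to fit):

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.RandomState(0)
    # Toy data: class 1 ("signal") is rare, class 0 ("background") is common.
    X = rng.randn(1000, 5)
    y = (X[:, 0] + 0.5 * rng.randn(1000) > 2.0).astype(int)

    # Penalise false alarms (background classified as signal) more heavily
    # by giving the background class a larger relative weight.
    clf = DecisionTreeClassifier(class_weight={0: 10.0, 1: 1.0},
                                 random_state=0)
    clf.fit(X, y)

    # Equivalent effect via per-sample weights, which fit also accepts:
    weights = np.where(y == 0, 10.0, 1.0)
    clf_w = DecisionTreeClassifier(random_state=0)
    clf_w.fit(X, y, sample_weight=weights)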
Hi Thomas,
Indeed, gini and entropy are the only supported impurity criteria for
classification. I don't think we have plans right now to add others - which
one do you have in mind?
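For reference, the two built-in criteria (and the splitter argument from the
earlier question, which in current scikit-learn accepts 'best' or 'random')
can be compared directly; a small sketch, assuming a recent scikit-learn and
the bundled iris data:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)

    # Cross-validated accuracy for each impurity criterion and split strategy.
    for criterion in ("gini", "entropy"):
        for splitter in ("best", "random"):
            clf = DecisionTreeClassifier(criterion=criterion,
                                         splitter=splitter,
                                         random_state=0)
            score = cross_val_score(clf, X, y, cv=5).mean()
            print(criterion, splitter, round(score, 3))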
> how feasible would it be to have the option of passing a custom function to
> the tree or forest to use in splitting?