Re: [Scikit-learn-general] Custom function in decision-tree based classifiers

2013-11-17 Thread Gilles Louppe
Hi, On 17 November 2013 22:45, Thomas Dent wrote:
> Hi Gilles -
> thanks for the reply. I think changing the relative class weights does more or less what we want, which is to optimize the classification at very low false alarm probability.
> Another question on the DecisionTreeClassifier...
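
For reference, a minimal sketch (not from the thread) of biasing a tree towards a low false-alarm rate by up-weighting the background class with sample_weight at fit time. The data, the 10:1 weight ratio, and the signal definition below are illustrative assumptions only.

# Sketch: make false alarms expensive by up-weighting the background class.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.RandomState(0)
X = rng.randn(1000, 5)
y = (X[:, 0] + 0.5 * rng.randn(1000) > 1.5).astype(int)  # rare "signal" class

# Weight the negative (background) class heavily so splits that admit
# false alarms look much worse to the impurity criterion.
weights = np.where(y == 0, 10.0, 1.0)

clf = DecisionTreeClassifier(criterion='gini', random_state=0)
clf.fit(X, y, sample_weight=weights)

signal_proba = clf.predict_proba(X)[:, 1]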

Re: [Scikit-learn-general] Custom function in decision-tree based classifiers

2013-11-17 Thread Thomas Dent
Hi Gilles - thanks for the reply. I think changing the relative class weights does more or less what we want, which is to optimize the classification at very low false alarm probability. Another question on the DecisionTreeClassifier: does the argument splitter='best' actually do anything?
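
A quick sketch (mine, not from the list) contrasting the two splitter settings: 'best' searches every candidate threshold on each considered feature, while 'random' draws one random threshold per feature, as in extremely randomized trees. The toy data and cross-validated scoring below are assumptions for illustration.

# Sketch: compare splitter='best' and splitter='random' on toy data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.RandomState(1)
X = rng.randn(500, 10)
y = (X[:, :3].sum(axis=1) > 0).astype(int)

for splitter in ('best', 'random'):
    clf = DecisionTreeClassifier(splitter=splitter, random_state=0)
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(splitter, round(score, 3))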

Re: [Scikit-learn-general] Custom function in decision-tree based classifiers

2013-11-07 Thread Gilles Louppe
Hi Thomas, Indeed, gini and entropy are the only supported impurity criteria for classification. I don't think we have plans right now to add others - which one do you have in mind?
> how feasible would it be to have the option of passing a custom function to the tree or forest to use in splitting?
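
For readers following along, a plain-Python sketch of what the two built-in criteria compute for a single node; scikit-learn's actual implementations are compiled Cython, and the example labels below are only an illustration.

# Sketch: the node impurities behind criterion='gini' and criterion='entropy'.
import numpy as np

def gini(y):
    """Gini impurity: 1 - sum_k p_k^2 over class frequencies p_k."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def entropy(y):
    """Shannon entropy: -sum_k p_k * log2(p_k)."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

y = np.array([0, 0, 0, 1, 1, 1, 1, 1])
print(gini(y), entropy(y))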

[Scikit-learn-general] Custom function in decision-tree based classifiers

2013-11-07 Thread Thomas Dent
Hi, the only current options for deciding on feature splits in trees / forests are 'entropy' and 'gini'. Two questions on this:
- is anyone planning on implementing others?
- how feasible would it be to have the option of passing a custom function to the tree or forest to use in splitting? W
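
To make the request concrete, here is a hedged, pure-Python illustration of what "passing a custom function to use in splitting" could mean: a toy exhaustive split search that accepts any impurity callable. This is not a scikit-learn API (the library's criteria are compiled Cython classes, which is why a Python callback is hard to support efficiently); best_split, misclassification, and the data are hypothetical names introduced here for illustration.

# Sketch: a toy split search parameterized by an arbitrary impurity callable.
import numpy as np

def best_split(X, y, impurity):
    """Return (feature, threshold) minimizing the weighted child impurity."""
    n, d = X.shape
    best = (None, None, np.inf)
    for j in range(d):
        for t in np.unique(X[:, j]):
            left, right = y[X[:, j] <= t], y[X[:, j] > t]
            if len(left) == 0 or len(right) == 0:
                continue
            score = (len(left) * impurity(left) + len(right) * impurity(right)) / n
            if score < best[2]:
                best = (j, t, score)
    return best[:2]

# Any callable mapping labels -> impurity works, e.g. misclassification error:
def misclassification(y):
    _, counts = np.unique(y, return_counts=True)
    return 1.0 - counts.max() / counts.sum()

rng = np.random.RandomState(2)
X = rng.randn(200, 3)
y = (X[:, 1] > 0.2).astype(int)
print(best_split(X, y, misclassification))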