Hi,

When I ran the following code:

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=100)
clf = GradientBoostingClassifier(random_state=0).fit(X, y)
imp = clf.feature_importances_
print("The sum of feature importances:", sum(imp))

The sum of the feature importances is not always equal to 1. Do you have a
good explanation for this behavior? Also, if a tree consists of only a root
node, is it correct to say that all of its feature importances are 0? My
guess is that such root-only trees pull the sum of the feature importances
below 1. Is that right?
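
For reference, here is a quick way to check that guess. It is just a
sketch: it assumes the per-stage regression trees are exposed through the
estimators_ array and that a root-only tree reports tree_.node_count == 1:

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=100)
clf = GradientBoostingClassifier(random_state=0).fit(X, y)

# Count the trees that never split, i.e. consist of a single root node.
# Such trees presumably carry zero importance for every feature, which is
# the point of the question above.
n_root_only = sum(
    tree.tree_.node_count == 1      # a root-only tree has exactly one node
    for stage in clf.estimators_    # one stage per boosting iteration
    for tree in stage               # one regression tree per class per stage
)
print("Root-only trees:", n_root_only, "out of", clf.estimators_.size)
print("Sum of feature importances:", clf.feature_importances_.sum())

If the number of root-only trees is nonzero whenever the printed sum falls
below 1, that would support the guess.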

Best,
Enhui
