print("feature importance sum for %i estimators: %.20f"
      % (n_estimators_forest, feature_importance_sum_forest))
for idx, imp in enumerate(forest_feat_imp):
    print("imp for tree %i: %.20f" % (idx, imp))
I suppose each tree carries a small rounding error, and these add up to the
overall discrepancy.
So is this a bug or an inevitable rounding issue?
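For context, a self-contained version of this check might look like the
sketch below (the dataset and model parameters are assumed for illustration;
they are not from the original script):

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
forest = RandomForestClassifier(n_estimators=1000, random_state=0).fit(X, y)

# The reported importances are an average over the trees; in exact
# arithmetic they would sum to 1.0, so any difference here is
# accumulated floating-point rounding error.
total = forest.feature_importances_.sum()
print("forest importance sum: %.20f (off by %e)" % (total, total - 1.0))

# Each tree's importances are normalized individually, so each per-tree
# sum should also be ~1.0 up to rounding.
for idx, tree in enumerate(forest.estimators_[:5]):
    print("tree %i sum: %.20f" % (idx, tree.feature_importances_.sum()))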
Greetings,
Hello everyone,
I’d like to bring this up again to see if people have any thoughts on it.
If you also think this is a bug, then we can track it and get it fixed. Please
share your opinions.
Thank you,
-Doug
From: Douglas Chan
Sent: Wednesday, August 31, 2016 4:52 PM
To: Scikit-learn user and developer mailing list
Subject: Re: [scikit-learn] Gradient Boosting: Feature Importances do not sum
to 1
Can you provide a reproducible example?
Raphael
On Wednesday, August 31, 2016, Douglas Chan wrote:
Hello everyone,
I’ve noticed conditions where the Feature Importance values do not add up to 1
in ensemble tree methods, such as Gradient Boosting Trees or AdaBoost Trees,
and I wonder if there’s a bug in the code.
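One way to probe this (an illustrative sketch with assumed data and
parameters, not the actual script that showed the problem) is to grow a
GradientBoostingClassifier with many estimators and print the importance sum
at full precision; whether it lands exactly on 1 depends on the scikit-learn
version and on floating-point accumulation:

import math
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=1000, n_features=30, random_state=0)
gbt = GradientBoostingClassifier(n_estimators=1000, random_state=0).fit(X, y)

imp = gbt.feature_importances_
print("naive sum: %.20f" % imp.sum())       # straight float accumulation
print("fsum:      %.20f" % math.fsum(imp))  # exactly rounded summation
print("deviation from 1: %e" % (imp.sum() - 1.0))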
This error occurs when the ensemble has a large number of estimators. The
exact conditions de