One first needs to define what "weak learner" means. In the boosting literature, if I recall correctly, a weak learner is a learner that underfits, performing only slightly better than random guessing. In that sense, a shallow tree is a weak learner, and that is what AdaBoost and gradient boosting use. In a random forest, however, the individual trees are deep trees that overfit, so they are not weak in that sense. That said, a single tree can never do what a whole forest does, so in that respect a single tree is weaker than the forest. Still, I have never seen the term "weak learner" used in the context of random forests. Sent from my phone - sorry for being brief and for possible misspellings.
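A minimal sketch of that contrast, assuming scikit-learn is installed: AdaBoost's default base estimator is a depth-1 decision stump (the classic weak learner), while RandomForest grows each tree with max_depth=None, so the individual trees get deep. The dataset and parameter values below are just illustrative.

```python
# Illustrative sketch: compare the depth of the trees inside AdaBoost
# (shallow stumps) with the trees inside a RandomForest (grown deep).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier

# Toy dataset, just for illustration.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# AdaBoost's default base estimator is DecisionTreeClassifier(max_depth=1).
ada = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, y)

# RandomForest uses max_depth=None by default: trees grow until leaves are pure.
rf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

print("AdaBoost tree depth:", ada.estimators_[0].get_depth())  # stump: depth 1
print("Forest tree depth:  ", rf.estimators_[0].get_depth())   # much deeper
```

With default settings, the boosting trees stay at depth 1 while the forest trees grow far deeper, which is exactly why calling the forest's trees "weak learners" is questionable.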
Hello guys,
The following reference states that random forests use weak learners: "The random forest starts with a standard machine learning technique called a 'decision tree' which, in ensemble terms, corresponds to our weak learner. ... Thus, in ensemble terms, the trees are weak learners and the random forest is a strong learner." I completely disagree with that statement, but I would like the community's opinion, to double-check that I am not missing something.
_______________________________________________ scikit-learn mailing list scikit-learn@python.org https://mail.python.org/mailman/listinfo/scikit-learn