Hello guys,
The following reference states that Random Forests use weak learners:
https://blog.citizennet.com/blog/2012/11/10/random-forests-ensembles-and-performance-metrics#:~:text=The%20random%20forest%20starts%20with,corresponds%20to%20our%20weak%20learner.&text=Thus%2C%20in%20ensemble%
Hi,
What are you wondering?
The individual tree is weakened by design (accepts more errors), so
indeed, the individual trees are weak learners and the combination of
them (the forest) becomes the strong learner.
You can have a strong tree as well (deeper, more parameters), but
that's not what is standard in a Random Forest.
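A minimal sketch of that point, assuming scikit-learn is available: fit one shallow ("weakened") tree and a forest of equally shallow trees on a synthetic problem and compare their test accuracy.

```python
# Compare a single shallow tree to a forest of equally shallow trees.
# The dataset and depth here are illustrative choices, not from the thread.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X_tr, y_tr)
forest = RandomForestClassifier(
    n_estimators=200, max_depth=2, random_state=0
).fit(X_tr, y_tr)

print("single shallow tree:", tree.score(X_te, y_te))
print("forest of shallow trees:", forest.score(X_te, y_te))
```

On most runs like this the forest noticeably outperforms any single one of its constituent trees, which is the "weak learners combine into a strong learner" picture.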
One needs to define what is meant by a weak learner. In boosting, if I recall the literature correctly, a weak learner refers to a learner that underfits, performing only slightly better than a random learner. In this regard, a tree with shallow depth is a weak learner, and is what is used in AdaBoost or gradient boosting.
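To illustrate that definition (a sketch assuming scikit-learn; the dataset is synthetic and chosen only for demonstration): a depth-1 tree, i.e. a decision stump, is the canonical boosting weak learner, and on a nontrivial problem its accuracy sits only modestly above chance.

```python
# A decision stump (depth-1 tree) as a weak learner: better than chance,
# but usually not by much.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, n_informative=5,
                           random_state=0)
stump = DecisionTreeClassifier(max_depth=1, random_state=0)
acc = cross_val_score(stump, X, y, cv=5).mean()
print(f"stump accuracy: {acc:.2f} (chance is 0.50 for two balanced classes)")
```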
As previously mentioned, a "weak learner" is just a learner that barely
performs better than random. It's more common in the context of
boosting, but I think weak learning predates boosting, and the original
RF paper by Breiman does make reference to "weak learners":
It's interesting that Fore
In my opinion the reference is distorting a concept that has a consolidated
definition in the community. I am also familiar with the definition of WL
as "an estimator slightly better than guessing", mostly decision stumps (
https://en.m.wikipedia.org/wiki/Decision_stump), which is not a component
of Random Forests.
> As previously mentioned, a "weak learner" is just a learner that barely
> performs better than random.
To follow up on what "a random learner" refers to, does it mean one of
the following?
(1) Classification: a learner which uniformly samples from one of the N
endpoints in the target.
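For what it's worth, here is a sketch of that reading (assuming scikit-learn; the 3-class dataset is just an example): a learner that predicts by sampling uniformly from the N classes has expected accuracy 1/N on balanced data.

```python
# "Random learner" as uniform guessing over classes: expected accuracy
# is about 1/N for N balanced classes (here N = 3, so roughly 0.33).
from sklearn.datasets import make_classification
from sklearn.dummy import DummyClassifier

X, y = make_classification(n_samples=3000, n_classes=3, n_informative=6,
                           random_state=0)
uniform = DummyClassifier(strategy="uniform", random_state=0).fit(X, y)
print("uniform-guess accuracy:", uniform.score(X, y))
```

Anything that consistently beats this 1/N baseline by some margin would then qualify as a (possibly weak) learner under the definition quoted above.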