Following on from the previous post, I had thought (from reading alone,
and with no prior hands-on experience of AdaBoost) that the main goal
of AdaBoost was to combine weak classifiers (e.g. a depth-restricted
DecisionTree) rather than to build an ensemble of strong classifiers
(as in e.g. a RandomForest).

The example on the site:
http://scikit-learn.org/dev/auto_examples/ensemble/plot_forest_iris.html
uses DecisionTrees with max_depth=None for each of the 4 classifiers.
Using a depth-restricted tree (e.g. max_depth=3) as AdaBoost's base
estimator gives the same classification quality in this example.
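For reference, a minimal sketch of the comparison I mean (it assumes
the dev-branch API where AdaBoostClassifier takes a base_estimator,
reuses n_estimators=30 from the example, and scores on the full iris
data rather than the two-feature pairs in the plot; cv=5 is an
arbitrary choice on my part):

from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import AdaBoostClassifier
from sklearn.cross_validation import cross_val_score

iris = load_iris()

# Compare an unrestricted (strong) base tree with a depth-restricted
# (weak) one; n_estimators=30 matches the plot_forest_iris example.
for depth in (None, 3):
    base = DecisionTreeClassifier(max_depth=depth)
    clf = AdaBoostClassifier(base_estimator=base, n_estimators=30)
    # cv=5 is an arbitrary choice here, not taken from the example
    scores = cross_val_score(clf, iris.data, iris.target, cv=5)
    print("max_depth=%s: mean accuracy %.3f" % (depth, scores.mean()))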

Might the example say more about AdaBoost's ability to exploit weak
classifiers if it used a depth-restricted DecisionTree?

Ian.

--
Ian Ozsvald (A.I. researcher)
i...@ianozsvald.com

http://IanOzsvald.com
http://MorConsulting.com/
http://Annotate.IO
http://SocialTiesApp.com/
http://TheScreencastingHandbook.com
http://FivePoundApp.com/
http://twitter.com/IanOzsvald
http://ShowMeDo.com
