week and I got these links from Andy's response
(thanks Andy!)
-Jason
From: Pagliari, Roberto [mailto:[email protected]]
Sent: Tuesday, April 14, 2015 3:08 PM
To: [email protected]
Subject: Re: [Scikit-learn-general] adaboost parameters
Hi Jason/Andreas,
ed in the video. I don't know whether other tips or rules of thumb are available.
Thanks,
From: Jason Wolosonovich [[email protected]]
Sent: Monday, April 13, 2015 10:47 PM
To: [email protected]
Subject: Re: [Scikit-learn-general] adaboost parameters
oject.
-Jason
From: Andreas Mueller [mailto:[email protected]]
Sent: Monday, April 13, 2015 3:31 PM
To: [email protected]
Subject: Re: [Scikit-learn-general] adaboost parameters
You might consider using gradient boosting instead.
See https://www.youtube.com/watch?v=IXZKgIsZRm0
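For readers of the archive, a minimal sketch of what that looks like with scikit-learn's GradientBoostingClassifier; the dataset and parameter values below are placeholders, not tuned recommendations:

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic stand-in for a real dataset
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Shallow trees are the usual weak learners for gradient boosting;
# these settings are illustrative starting points only
clf = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1,
                                 max_depth=3, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))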
From: Jason Wolosonovich [mailto:[email protected]]
Sent: Saturday, April 11, 2015 9:13 AM
To: [email protected]
Subject: Re: [Scikit-learn-general] adaboost parameters
What is your dataset like? How are you building your individual classifier that
you are ensembling with AdaBoost? A common use case would be boosted
decision stumps (one-level decision trees).
http://en.wikipedia.org/wiki/Decision_stump
Thanks,
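As a concrete illustration of the boosted-stumps setup Jason describes, a minimal sketch using scikit-learn's AdaBoostClassifier on synthetic data (note: base_estimator was later renamed to estimator in scikit-learn 1.2):

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# A decision stump is a depth-1 tree; it is also AdaBoostClassifier's
# default weak learner, so it is spelled out here only for clarity
stump = DecisionTreeClassifier(max_depth=1)
clf = AdaBoostClassifier(base_estimator=stump, n_estimators=50,
                         learning_rate=1.0, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))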
From: Pagliari, Roberto [mailto:[email protected]]
Sent: Friday, April 10, 2015 1:18 PM
To: [email protected]
Subject: [Scikit-learn-general] adaboost parameters
When using adaboost, what is a range of values of n_estimators and learning
rate that makes sense to optimize over?
Thank you,
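Since no hard rule of thumb surfaced in the thread, one common approach is to cross-validate over a joint grid of the two parameters; the ranges below are only a starting guess, not values endorsed anywhere above:

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.grid_search import GridSearchCV  # sklearn.model_selection in >=0.18

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Larger n_estimators usually pairs with a smaller learning_rate,
# so the two should be searched jointly rather than one at a time
param_grid = {
    "n_estimators": [50, 100, 200, 400],
    "learning_rate": [0.01, 0.05, 0.1, 0.5, 1.0],
}
search = GridSearchCV(AdaBoostClassifier(random_state=0), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)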