Depends.
If you give "fit_params" to cross_val_score it will be passed to
GridSearchCV in the correct way, I believe.
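For concreteness, a rough sketch of what I mean, assuming a scikit-learn version whose GridSearchCV.fit forwards **fit_params to the underlying estimator (in recent releases the cross_val_score argument is called "params" instead of "fit_params"); the data and the sample_weight parameter are made up for illustration:

import numpy as np
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X = rng.rand(100, 5)
y = rng.randint(0, 2, 100)
sample_weight = np.ones(100)

# Inner loop: hyperparameter search over C.
grid = GridSearchCV(SVC(), param_grid={"C": [0.1, 1, 10]}, cv=3)

# fit_params given to cross_val_score are forwarded to grid.fit() on each
# outer training split (array-like values of length n_samples are indexed
# along with X), and GridSearchCV passes them on to SVC.fit().
scores = cross_val_score(grid, X, y, cv=5,
                         fit_params={"sample_weight": sample_weight})
print(scores)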
On 10/30/2015 06:36 AM, Christoph Sawade wrote:
Thanks for the response. I am actually interested in the new
DisjointLabelKFold (https://github.com/scikit-learn/scikit-learn/pull/)
which depends on an additional label. This use case does not seem to be
covered yet in the new sklearn.model_selection, or is it?
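For reference, a minimal sketch of that label-dependent splitting, assuming a scikit-learn version where the splitter from that pull request is exposed as GroupKFold in sklearn.model_selection (it went by LabelKFold / DisjointLabelKFold while under review); the data and group ids below are made up:

import numpy as np
from sklearn.model_selection import GroupKFold, cross_val_score
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
X = rng.rand(60, 4)
y = rng.randint(0, 2, 60)
groups = np.repeat(np.arange(12), 5)   # e.g. one id per subject/session

cv = GroupKFold(n_splits=3)            # keeps each group within a single fold
scores = cross_val_score(LogisticRegression(), X, y, groups=groups, cv=cv)
print(scores)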
Changes to support this case have recently been merged into master, and an
example is on its way:
https://github.com/scikit-learn/scikit-learn/issues/5589
I think you should be able to run your code by importing GridSearchCV,
cross_val_score and StratifiedShuffleSplit from the new
sklearn.model_selection.
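Something along these lines, for example (the estimator and parameter grid are just placeholders):

import numpy as np
from sklearn.model_selection import (GridSearchCV, StratifiedShuffleSplit,
                                     cross_val_score)
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X = rng.rand(100, 5)
y = rng.randint(0, 2, 100)

# Inner loop: GridSearchCV tunes the hyperparameters.
inner = GridSearchCV(SVC(), param_grid={"C": [0.1, 1, 10]}, cv=3)

# Outer loop: StratifiedShuffleSplit estimates generalization performance.
outer = StratifiedShuffleSplit(n_splits=5, test_size=0.2, random_state=0)

# Each outer split refits the whole grid search on its training part only,
# so the reported scores are not biased by the hyperparameter selection.
scores = cross_val_score(inner, X, y, cv=outer)
print(scores.mean(), scores.std())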
Hey there!
A common goal when training a machine learning model is to also estimate
its performance. This is often done via cross-validation. In order to also
tune hyperparameters, one might want to nest one cross-validation loop
inside another. The sklearn framework makes that very easy. Howe