Hi Andy,
Thanks for the help, but I really just had issues with Linux. I reinstalled
everything and it is working now.
Cheers,
From: Andy [mailto:[email protected]]
Sent: Friday, July 25, 2014 3:58 PM
To: [email protected]
Subject: Re: [Scikit-learn-general] GridSearchVC
On 07/23/2014 06:21 PM, Pagliari, Roberto wrote:
Hi Michael,
Thanks for the clarifications.
Is there a way to make predictions once grid search is done? Right now
I'm getting the error
'GridSearchCV' object has no attribute 'best_estimator_'
and I've seen other people reporting the same error.
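For context, this particular AttributeError is typically what scikit-learn 0.15 raises whenever predict or best_estimator_ is used before fit has completed; a minimal sketch (module paths as in this thread, data purely illustrative):

from sklearn import svm, datasets
from sklearn.grid_search import GridSearchCV

iris = datasets.load_iris()
clf = GridSearchCV(svm.SVC(), {'C': [1, 10]})

# clf.predict(iris.data)  # before fit: AttributeError: 'GridSearchCV'
#                         # object has no attribute 'best_estimator_'

clf.fit(iris.data, iris.target)   # fit sets best_estimator_, best_params_, ...
clf.predict(iris.data)            # now works; delegates to clf.best_estimator_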
I used pip, after installing all required libraries, including Fortran.
-----Original Message-----
From: Lars Buitinck [mailto:[email protected]]
Sent: Wednesday, July 23, 2014 3:56 PM
To: scikit-learn-general
Subject: Re: [Scikit-learn-general] GridSearchVC with SVM
2014-07-23 21:31 GMT+02:00 Pagliari, Roberto :
> It says 0.15.0
>
> Right now I am finding the optimal values manually, using cross_validation
> (by picking the best average).
That can't be right. This attribute was in place in at least 0.14.0.
How did you install scikit-learn?
2014-07-23 18:21 GMT+02:00 Pagliari, Roberto :
> Is there a way to make predictions once grid search is done? Right now I'm
> getting the error
>
> 'GridSearchCV' object has no attribute 'best_estimator_'
Works fine here. What does `python -c 'import sklearn; print(sklearn.__version__)'` say?
2014-07-23 18:07 GMT+02:00 Michael Eickenberg :
> To answer 1): yes, if you set cv=number, then it will do K-fold
> cross-validation with that number of folds. You can do this explicitly by
> using
>
> from sklearn.cross_validation import KFold
>
> cv = KFold(len(data), 6)
>
> and pass cv as an argument to GridSearchCV.
Could you provide a minimal example of k-fold cross-validation with prediction?
Thank you,
From: Michael Eickenberg [mailto:[email protected]]
Sent: Wednesday, July 23, 2014 12:08 PM
To: [email protected]
Subject: Re: [Scikit-learn-general] GridSearchVC with SVM
To answer 1): yes, if you set cv=number, then it will do K-fold
cross-validation with that number of folds. You can do this explicitly by
using
from sklearn.cross_validation import KFold
cv = KFold(len(data), 6)
and pass cv as an argument to GridSearchCV.
To answer question 2 I think we need s[…]
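Putting the two answers together, a minimal sketch (assuming scikit-learn 0.15 as used in this thread, i.e. the old sklearn.cross_validation and sklearn.grid_search module paths; parameter values are illustrative):

from sklearn import svm, datasets
from sklearn.cross_validation import KFold
from sklearn.grid_search import GridSearchCV

iris = datasets.load_iris()
X, y = iris.data, iris.target

cv = KFold(len(X), n_folds=6)          # explicit 6-fold CV, same as cv=6

parameters = {'kernel': ('linear', 'rbf'), 'C': [1, 10]}
clf = GridSearchCV(svm.SVC(), parameters, cv=cv)
clf.fit(X, y)                          # runs the grid search over the folds

print(clf.best_params_)                # best parameter combination found
predictions = clf.predict(X)           # predicts with the refit best_estimator_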
This is an example of how to perform grid search with an SVM.
>>> from sklearn import svm, grid_search, datasets
>>> iris = datasets.load_iris()
>>> parameters = {'kernel':('linear', 'rbf'), 'C':[1, 10]}
>>> svr = svm.SVC()
>>> clf = grid_search.GridSearchCV(svr, parameters)
>>> clf.fit(iris.data, iris.target)
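After the fit call, the attributes discussed above become available, e.g. (a sketch; outputs omitted):
>>> clf.best_params_              # best parameter setting found on iris
>>> clf.best_estimator_           # the SVC refit on the full dataset
>>> clf.predict(iris.data[:5])    # predicts with best_estimator_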