Hi Julian,

Unfortunately, we do not yet have a generic (working for any classifier) way to search for optimal hyperparameters (it is on our TODO list). ModelSelector was crafted by Emanuele for GPR.
BUT it is quite easy to implement basic hyperparameter selection via a grid search within a nested cross-validation procedure. E.g., http://github.com/PyMVPA/PyMVPA/blob/master/doc/examples/nested_cv.py is an example which sweeps through all classifiers in the warehouse to determine the 'best' one for every cross-validation split. Your scenario is virtually the same -- you would just use the same classifier throughout and vary its hyperparameters instead.

Hope this is of some help,
Yaroslav

On Wed, 06 Oct 2010, Karch, Julian wrote:

> Hello,
> I'm currently working on a project where I have to select hyperparameters for
> the libSVM implementation which generate a model that fits a dataset best
> (compared to other hyperparameters). Is there any implementation available in
> PyMVPA that accomplishes this job? I found ModelSelector, but as I understood
> its logic, it needs the classifier/model to have a method called
> compute_log_marginal_likelihood, which the SVM classifiers seem to lack.
> Julian Karch
> Research Assistant

--
.-.
=------------------------------ /v\ ----------------------------=
Keep in touch // \\ (yoh@|www.)onerussian.com
Yaroslav Halchenko /( )\ ICQ#: 60653192
Linux User ^^-^^ [175555]

_______________________________________________
Pkg-ExpPsy-PyMVPA mailing list
Pkg-ExpPsy-PyMVPA@lists.alioth.debian.org
http://lists.alioth.debian.org/mailman/listinfo/pkg-exppsy-pymvpa
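[Editor's note] The grid-search-inside-nested-CV idea suggested above can be sketched in plain Python. This is a hypothetical illustration only, not the PyMVPA API: a toy k-NN classifier stands in for libSVM, and its neighborhood size `k` stands in for an SVM hyperparameter such as C. The outer folds give an unbiased error estimate; the inner CV on each outer training portion picks the hyperparameter.

```python
# Hedged sketch: hyperparameter grid search inside nested cross-validation.
# All names (knn_predict, cv_error, nested_cv) are hypothetical stand-ins,
# NOT PyMVPA functions; the classifier is a toy k-NN, not libSVM.

def knn_predict(train_x, train_y, x, k):
    """Predict the label of x by majority vote among its k nearest neighbors."""
    order = sorted(range(len(train_x)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(train_x[i], x)))
    votes = [train_y[i] for i in order[:k]]
    return max(set(votes), key=votes.count)

def cv_error(xs, ys, k, n_folds=3):
    """Mean misclassification rate of k-NN across n_folds CV folds (inner loop)."""
    folds = [list(range(len(xs)))[i::n_folds] for i in range(n_folds)]
    errs = []
    for fold in folds:
        tr = [i for i in range(len(xs)) if i not in fold]
        wrong = sum(knn_predict([xs[i] for i in tr], [ys[i] for i in tr],
                                xs[j], k) != ys[j] for j in fold)
        errs.append(wrong / float(len(fold)))
    return sum(errs) / len(errs)

def nested_cv(xs, ys, grid, n_outer=3):
    """For each outer fold: pick the best hyperparameter on the training
    portion via inner CV, then report its error on the held-out fold."""
    outer = [list(range(len(xs)))[i::n_outer] for i in range(n_outer)]
    results = []
    for fold in outer:
        tr = [i for i in range(len(xs)) if i not in fold]
        tr_x, tr_y = [xs[i] for i in tr], [ys[i] for i in tr]
        best_k = min(grid, key=lambda k: cv_error(tr_x, tr_y, k))
        wrong = sum(knn_predict(tr_x, tr_y, xs[j], best_k) != ys[j]
                    for j in fold)
        results.append((best_k, wrong / float(len(fold))))
    return results
```

With PyMVPA one would express the same structure with a real SVM and its splitters, looping the grid over, e.g., the C parameter; the skeleton above only shows where the two cross-validation loops nest.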