Hi Alexandre,
I recently looked into this subject as well.
In "Parameter determination of support vector machine and feature
selection using simulated annealing approach" [1] a stochastic optimization
method that has nice theoretical properties [2] is used to optimize at the
same time both feature selection and rbf svm hyper-parameters.
Starting from there, I verified that stochastic and heuristic-based
methods in general can be used effectively on these problems: feature
selection, hyper-parameter optimization, or both at the same time.
There are many papers on the subject...
From what I gleaned, the method proposed in [1] is very simple to
implement and to "trim".
Other interesting optimization methods that represent the state of the
art are:
- Particle Swarm Optimization
- Cuckoo search
Paolo
[1] http://www.csie.ntu.edu.tw/~b92103/sdarticle.pdf
[2] - guaranteed to converge to the global maximum (given a slow enough
cooling schedule)
- the search is denser around the local and global maxima
On Tue, Nov 15, 2011 at 4:06 AM, Alexandre Passos <[email protected]> wrote:
> Hello, scikiters,
>
> Recent work by James Bergstra demonstrated that careful hyperparameter
> optimization, as well as careless random sampling, is often better
> than manual searching for many problems. You can see results in the
> following NIPS paper:
> http://people.fas.harvard.edu/~bergstra/files/pub/11_nips_hyperopt.pdf
>
> I wonder if there's interest in adding some simple versions of these
> techniques to the scikit's very useful GridSearchCV? There is code
> available at https://github.com/jaberg/hyperopt, but it seems to be
> research code and it uses Theano, so it's not directly applicable to
> the scikit.
>
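P.S. The "careless random sampling" baseline mentioned above needs only
a few lines on top of what the scikit already provides. Here is a
minimal sketch (the log-uniform ranges for C and gamma are my own
guesses, not Bergstra's):

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X, y = load_breast_cancer(return_X_y=True)

best_score, best_params = -np.inf, None
for _ in range(50):
    # draw C and gamma log-uniformly: scale parameters are better
    # sampled on a log scale than on a linear one
    params = {"C": 10.0 ** rng.uniform(-2, 3),
              "gamma": 10.0 ** rng.uniform(-5, 0)}
    score = cross_val_score(SVC(**params), X, y, cv=3).mean()
    if score > best_score:
        best_score, best_params = score, params

print("best CV accuracy %.3f with %r" % (best_score, best_params))

Something along these lines could be a natural complement to
GridSearchCV.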