Does emcee implement Bayesian optimization?
What is the distribution you assume? GPs?
I thought emcee was a sampler. I need to check in with Dan ;)
On 03/09/2015 09:27 AM, Sturla Molden wrote:
For Bayesian optimization with MCMC (which I believe spearmint also
does) I have found that emcee is very nice:
http://dan.iel.fm/emcee/current/
It is much faster than naïve MCMC methods, and all we need to do is
provide a callback that computes the log-likelihood given the parameter
set (which
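The point above, that the sampler only needs a callback returning the log-likelihood, can be illustrated with a tiny Metropolis-Hastings loop. This is a hand-rolled, single-chain stand-in for illustration, not emcee's actual ensemble-sampler API, and the Gaussian target is a toy placeholder:

```python
import math
import random

def log_likelihood(theta):
    # Toy target: standard normal log-density. A real application would
    # plug in the model's log-likelihood here; any theta -> float works.
    return -0.5 * theta * theta - 0.5 * math.log(2 * math.pi)

def metropolis(log_like, theta0, n_steps=5000, step=1.0, seed=0):
    """Minimal Metropolis-Hastings chain driven only by a log-likelihood callback."""
    rng = random.Random(seed)
    theta, logp = theta0, log_like(theta0)
    samples = []
    for _ in range(n_steps):
        proposal = theta + rng.gauss(0.0, step)
        logp_new = log_like(proposal)
        # Accept with probability min(1, exp(logp_new - logp)).
        if math.log(rng.random()) < logp_new - logp:
            theta, logp = proposal, logp_new
        samples.append(theta)
    return samples

samples = metropolis(log_likelihood, theta0=0.0)
mean = sum(samples) / len(samples)  # close to 0 for this toy target
```

emcee wraps the same idea in an affine-invariant ensemble of walkers, which is what makes it faster than a single naïve chain like this one.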
Hi Christof.
I think implementing either the GP or SMAC approach would be good.
I talked to Jasper Snoek on Friday; possibly the trickiest part for
the GP is the optimization of the resulting function.
Spearmint also marginalizes out the hyperparameters, which our upcoming
GP implementation
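The "optimization of the resulting function" mentioned above usually means maximizing an acquisition function such as expected improvement over the GP posterior. A sketch under toy assumptions: the posterior mean/std callables below are placeholders for a fitted GP, the search domain is [0, 1], and plain random search stands in for the multi-start local optimization real implementations use:

```python
import math
import random

def norm_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))

def expected_improvement(mu, sigma, best, xi=0.01):
    """EI for minimization: expected amount by which we beat `best`."""
    if sigma <= 0.0:
        return 0.0
    z = (best - mu - xi) / sigma
    return (best - mu - xi) * norm_cdf(z) + sigma * norm_pdf(z)

# Placeholder GP posterior on [0, 1] -- stand-ins for a fitted model.
posterior_mean = lambda x: (x - 0.3) ** 2
posterior_std = lambda x: 0.1 + 0.05 * x

def maximize_ei(best, n_candidates=2000, seed=0):
    """Random-search maximization of EI; real code would refine locally."""
    rng = random.Random(seed)
    candidates = [rng.random() for _ in range(n_candidates)]
    return max(candidates, key=lambda x: expected_improvement(
        posterior_mean(x), posterior_std(x), best))

next_x = maximize_ei(best=0.05)  # next point to evaluate expensively
```

The difficulty alluded to is that the acquisition surface is multi-modal, so a single gradient run from one start point is not enough.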
Thanks for all the release work Olivier!
On Mon, Mar 9, 2015 at 11:46 AM, Gael Varoquaux
gael.varoqu...@normalesup.org wrote:
Bravo!
Thanks for handling this.
Gaël
On Mon, Mar 09, 2015 at 04:42:12PM +0100, Olivier Grisel wrote:
The first beta for scikit-learn 0.16 is available on PyPI:
Yeah, I don't think we want to include that in the scope of the GSoC.
Using MLE parameters still works, just converges a bit slower.
On 03/09/2015 11:28 AM, Jan Hendrik Metzen wrote:
A combination of emcee with GPs (in this case the GPs from george) is
described here:
The first beta for scikit-learn 0.16 is available on PyPI:
https://pypi.python.org/pypi/scikit-learn/0.16b1
You can install it with:
pip install scikit-learn==0.16b1
or by downloading the archive and building it from source as usual.
Please feel free to report bugs on github. In particular if
A combination of emcee with GPs (in this case the GPs from george) is
described here:
http://dan.iel.fm/george/current/user/hyper/#sampling-marginalization
As PR #4270 for sklearn also exposes a method
log_marginal_likelihood(theta) in GaussianProcessRegressor, it should be
straightforward to
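The combination can be sketched without either library: treat the samples an MCMC run (e.g. emcee targeting `log_marginal_likelihood(theta)`) would return as given, then marginalize by averaging per-sample predictions. The chain and the per-theta predictor below are synthetic placeholders, not output of a real GP:

```python
import random

# Hypothetical chain of hyperparameter samples, as an MCMC run over the
# log marginal likelihood would return; here synthetic draws around 1.0.
rng = random.Random(0)
thetas = [rng.gauss(1.0, 0.2) for _ in range(2000)]

def predict_given_theta(x, theta):
    # Placeholder for a GP posterior mean conditioned on hyperparameters.
    return theta * x

def marginalized_predict(x, thetas):
    """Monte Carlo marginalization: average predictions over the chain."""
    return sum(predict_given_theta(x, t) for t in thetas) / len(thetas)

pred = marginalized_predict(2.0, thetas)
```

This is the same recipe the george docs linked above describe, with the GP's `log_marginal_likelihood` supplying the target density for the sampler.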
Cool! Congratulations!
Michael
On Mon, Mar 9, 2015 at 4:42 PM, Olivier Grisel olivier.gri...@ensta.org
wrote:
The first beta for scikit-learn 0.16 is available on PyPI:
https://pypi.python.org/pypi/scikit-learn/0.16b1
You can install it with:
pip install scikit-learn==0.16b1
or by
Hi,
Sorry, I was not aware of the patches. I have used sklearn a lot, so I can
hopefully send a couple of patches in the next few days; this should not be
a problem.
Regarding my code:
I am writing some machine learning algorithms in Python for my sponsor
company. We work mainly with medium size
We wanted a bot that tells us about violations on PRs.
Not sure if landscape.io can provide that:
https://github.com/scikit-learn/scikit-learn/issues/3888#issuecomment-76037183
ragv also looked into this, I think.
Not necessarily a binary fail/pass, but more like a report by a bot.
On 03/09/2015
Hi,
I'm not sure how to provide the StratifiedKFold parameter to GridSearchCV.
Should it be part of the pipeline?
Thank you,
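For reference, the splitter goes into GridSearchCV's `cv` parameter rather than into the pipeline. A minimal sketch with a current scikit-learn (module paths have moved since this thread; the estimator and grid here are arbitrary examples):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, StratifiedKFold

X, y = load_iris(return_X_y=True)

# Pass the CV splitter via the cv argument, not as a pipeline step.
cv = StratifiedKFold(n_splits=3, shuffle=True, random_state=0)
grid = GridSearchCV(
    LogisticRegression(max_iter=1000),
    param_grid={"C": [0.1, 1.0, 10.0]},
    cv=cv,
)
grid.fit(X, y)
best_C = grid.best_params_["C"]
```

A pipeline, if you use one, is the `estimator` argument; the cross-validation strategy stays separate in `cv`.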
From: Pagliari, Roberto [mailto:rpagli...@appcomsci.com]
Sent: Wednesday, February 25, 2015 8:17 PM
To: scikit-learn-general@lists.sourceforge.net
Subject: Re:
Congratulations! This has been a long time coming, and if not only for the
swathe of features it'll be great to see the documentation improvements
appearing on stable soon!
My thoughts on development priorities for the next release (and ideally to
focus on before GSoC eats everyone's brains):
We
On 03/09/2015 10:44 PM, Joel Nothman wrote:
Congratulations! This has been a long time coming, and if not only for
the swathe of features it'll be great to see the documentation
improvements appearing on stable soon!
My thoughts on development priorities for the next release (and
ideally to