Hi Stanislaw.
Thanks for your mail.
Discussions like this should go to the scikit-learn mailing list (I just
forwarded it there).
We had a similar discussion about SOMs recently, and the outcome was
that we want algorithms that perform well in practice.
So if you have a machine learning problem where GNG has
Hi,
in previous versions of scikit-learn I used GradientBoostingRegressor
with the parameters:
- loss = 'huber'
- subsample = 0.8
After updating scikit-learn to version 0.14.1, I can use the 'huber'
loss function only if subsample = 1.0.
For e.g. subsample = 0.8, the error message below is displayed:
...
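For context, the 'huber' loss combines squared error for small residuals with absolute error for large ones, which makes it robust to outliers. A minimal pure-Python sketch (the delta threshold here is illustrative, not any value internal to scikit-learn):

```python
def huber(residual, delta=1.0):
    """Huber loss: quadratic inside +/- delta, linear outside."""
    r = abs(residual)
    if r <= delta:
        return 0.5 * r * r           # quadratic region (like squared error)
    return delta * (r - 0.5 * delta)  # linear region (like absolute error)

# Small residuals are penalized quadratically, large ones linearly:
print(huber(0.5))   # 0.125
print(huber(3.0))   # 2.5
```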
I finally found a desk and some focus. I addressed Mathieu's
suggestions and added some timings on real data (with a lot of
concessions so that it would run reasonably quickly on my machine).
Here are the results: http://nbviewer.ipython.org/7224672
It becomes clear that `tol` still means different
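(For anyone following along: the usual way a `tol` parameter can mean different things across solvers is an absolute versus a relative convergence check. A hypothetical sketch of the distinction, not scikit-learn's actual code:)

```python
def converged_abs(prev, curr, tol=1e-4):
    """Absolute criterion: stop when the raw change is below tol."""
    return abs(prev - curr) < tol

def converged_rel(prev, curr, tol=1e-4):
    """Relative criterion: stop when the change is small *compared to
    the magnitude of the objective* (with a floor of 1.0)."""
    return abs(prev - curr) < tol * max(abs(prev), 1.0)

# The same tol gives very different stopping points at large scales:
print(converged_abs(1000.0, 1000.05))  # False
print(converged_rel(1000.0, 1000.05))  # True
```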
Sergey Feldman sergeyfeldman@... writes:
Hi Sergey,
I am having the exact same issue with my data set, except it's
80-dimensional and I only have two classes, which are unbalanced at
about 5:1. I am also using random forest in sklearn. Just curious: how
well did using Manish's
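One common remedy for a ~5:1 imbalance is to weight each class inversely to its frequency, i.e. the `n_samples / (n_classes * count)` scheme. A small illustrative sketch in plain Python (hedged: this mirrors the usual "balanced" heuristic, not necessarily what any particular scikit-learn version does internally):

```python
from collections import Counter

def balanced_weights(y):
    """Weight each class by n_samples / (n_classes * class_count),
    so the rare class counts for as much as the common one."""
    counts = Counter(y)
    n, k = len(y), len(counts)
    return {cls: n / (k * cnt) for cls, cnt in counts.items()}

# 5:1 imbalance -> the minority class gets 5x the weight of the majority:
y = [0, 0, 0, 0, 0, 1]
print(balanced_weights(y))  # {0: 0.6, 1: 3.0}
```

These weights can then be passed per-sample to an estimator's `fit` via a sample-weight argument, or used to resample the training set.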
Hi Johannes,
The bug was fixed recently; please use the master branch while there is
no 0.15 release.
Best,
Peter
On 19.11.2013 16:33, hannithebunny hannithebu...@hotmail.de wrote:
I could have a look, if only I could figure out how to create a second
fork of scikit-learn on GitHub... My current fork has the proposed
change to grid_search that I submitted.
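(A second fork isn't actually needed: GitHub allows only one fork of a repository per account, but a single fork can carry any number of independent branches, one per pull request. A sketch of that workflow; the remote and branch names here are illustrative assumptions:)

```shell
# Inside your existing local clone of your fork:
cd scikit-learn
# 'upstream' is assumed to point at the main scikit-learn/scikit-learn repo.
git fetch upstream
# Start the second change on a fresh branch based on upstream master,
# so it stays independent of the grid_search branch:
git checkout -b second-change upstream/master
# ...commit the new work, then push the branch to your fork ('origin')
# and open the second pull request from it:
git push origin second-change
```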
Michal
On 19/11/13 19:15, scikit-learn-general-requ...@lists.sourceforge.net wrote:
Apparently those are all