On Sun, Sep 21, 2014 at 1:55 AM, Olivier Grisel <olivier.gri...@ensta.org>
wrote:

> On a related note, here is an implementation of Logistic Regression
> applied to one-hot features obtained from leaf membership info of a
> GBRT model:
>
> http://nbviewer.ipython.org/github/ogrisel/notebooks/blob/master/sklearn_demos/Income%20classification.ipynb#Using-the-boosted-trees-to-extract-features-for-a-Logistic-Regression-model
>
> This is inspired by this paper from Facebook:
> https://www.facebook.com/publications/329190253909587/ .
>
> It's easy to implement and seems to work quite well.
>
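[For readers following along, a minimal sketch of the idea described above — not Olivier's notebook verbatim, just an illustration assuming a scikit-learn version where GradientBoostingClassifier exposes leaf indices via apply(): fit the GBRT, one-hot encode the leaf each sample lands in for every tree, then train a Logistic Regression on that sparse encoding.]

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OneHotEncoder

X, y = make_classification(n_samples=5000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 1) Fit the GBRT model on the raw features.
gbrt = GradientBoostingClassifier(n_estimators=100, random_state=0)
gbrt.fit(X_train, y_train)

# 2) Leaf membership: apply() returns, for each sample, the index of the
#    leaf it falls into in every tree (shape (n_samples, n_estimators, 1)
#    for binary problems); one-hot encode those indices.
leaves_train = gbrt.apply(X_train)[:, :, 0]
leaves_test = gbrt.apply(X_test)[:, :, 0]
encoder = OneHotEncoder(handle_unknown="ignore")  # ignore leaves unseen at fit time
Z_train = encoder.fit_transform(leaves_train)
Z_test = encoder.transform(leaves_test)

# 3) Fit a Logistic Regression on the sparse one-hot leaf features.
lr = LogisticRegression(max_iter=1000)
lr.fit(Z_train, y_train)
print("GBRT + LR test accuracy:", lr.score(Z_test, y_test))

[The one-hot leaf matrix is sparse, which Logistic Regression handles natively; handle_unknown="ignore" guards against leaves that never occurred in the data the encoder was fit on.]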

What is the advantage of this method over using GBRT directly?

Mathieu


>
> --
> Olivier
> http://twitter.com/ogrisel - http://github.com/ogrisel
>