Could you give a reference for gradient boosting with fully corrective
updates?
Since the philosophy of gradient boosting is to fit each new tree to the
residuals (or negative gradient) of the ensemble built so far, I am wondering
how such a fully corrective update would work...
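
My naive guess, for squared loss, would be something like the sketch below:
each new tree is still fit to the current residuals, but after adding it the
weights of all trees so far are re-fit jointly, which for this loss is just a
least-squares problem over the trees' predictions. This is purely
illustrative, not scikit-learn code, and fully_corrective_boost is a made-up
name:

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fully_corrective_boost(X, y, n_estimators=10, max_depth=3):
    trees, preds = [], []              # fitted trees and their predictions on X
    F = np.zeros(len(y), dtype=float)  # current ensemble prediction
    for m in range(n_estimators):
        # Stagewise step: fit the next tree to the residuals (the negative
        # gradient of squared loss) of the current ensemble.
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, y - F)
        trees.append(tree)
        preds.append(tree.predict(X))

        # "Fully corrective" step: re-fit the weights of all trees so far by
        # solving a least-squares problem over their predictions, instead of
        # keeping earlier trees' contributions fixed as plain gradient
        # boosting does.
        H = np.column_stack(preds)                 # (n_samples, m + 1)
        w, *_ = np.linalg.lstsq(H, y, rcond=None)  # joint re-weighting
        F = H @ w
    return trees, w

# Predictions on new data would then be sum_k w[k] * trees[k].predict(X_new).

But I am not sure this is what is usually meant, hence the request for a
reference.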
Mathieu
On Tue, Sep 16, 2014 at 9:16 AM, c TAKES <ctakesli...@gmail.com> wrote:
> Is anyone working on making GradientBoostingRegressor work with sparse
> matrices?
>
> Or is anyone working on adding an option for fully corrective gradient
> boosting, i.e., all trees in the ensemble are re-weighted at each iteration?
>
> These are things I would like to see and may be able to help with if no
> one is currently working on them.