In the field of reinforcement learning (RL), the Fitted-Q algorithm of
Ernst et al. (2005) (http://www.jmlr.org/papers/volume6/ernst05a/ernst05a.pdf)
relies on the ability to fix the tree structure to ensure convergence (see
p. 515 of the JMLR paper).
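
For concreteness, here is roughly what the fit loop looks like with
scikit-learn (a minimal sketch only; the function name, shapes and constants
are mine, not from the paper):

    import numpy as np
    from sklearn.ensemble import ExtraTreesRegressor

    def fitted_q_iteration(S, A, R, S_next, actions, gamma=0.95, n_iter=50):
        # S: states, A: actions taken (as a column), R: rewards,
        # S_next: next states, actions: the discrete set of actions
        X = np.hstack([S, A])
        model = None
        for k in range(n_iter):
            if model is None:
                targets = R  # first iteration: Q_1 is the immediate reward
            else:
                # max over candidate actions of Q_k(s', a')
                q_next = np.column_stack([
                    model.predict(
                        np.hstack([S_next, np.full((len(S_next), 1), a)]))
                    for a in actions
                ])
                targets = R + gamma * q_next.max(axis=1)
            # A brand new forest is grown at every iteration; the convergence
            # argument of the paper instead needs the structure grown at some
            # iteration to be frozen, with only the leaf values refitted.
            model = ExtraTreesRegressor(n_estimators=50).fit(X, targets)
        return model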

The ``warm_start`` option is useful, but it only grows additional trees; it
does not allow the existing tree structure to be frozen while the leaf values
are refitted.
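
What I mean is that with ``warm_start`` a second call to ``fit`` only appends
new trees once ``n_estimators`` is increased; the existing trees keep both
their structure and their old leaf values (small sketch with dummy data):

    import numpy as np
    from sklearn.ensemble import ExtraTreesRegressor

    rng = np.random.RandomState(0)
    X = rng.rand(100, 3)
    y_iter1 = rng.rand(100)   # targets from one Fitted-Q iteration
    y_iter2 = rng.rand(100)   # targets from the next iteration

    est = ExtraTreesRegressor(n_estimators=50, warm_start=True, random_state=0)
    est.fit(X, y_iter1)

    # The 50 trees fitted above are left untouched; 50 new trees are grown on
    # y_iter2 and appended.  Nothing lets me keep the old structure while
    # refitting its leaf values on y_iter2.
    est.set_params(n_estimators=100)
    est.fit(X, y_iter2)
    print(len(est.estimators_))   # 100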

Fitted-Q is widely used in RL, and adding a freezing option would certainly
attract a lot of interest. On the other hand, I understand that this might not
be possible for the sake of keeping the interface general.

My understanding of ``tree.py`` is that this might be achievable with a
custom ``Splitter`` that doesn't actually split anything but only refreshes
the leaf values.
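
The cruder hack I can think of, rather than a custom ``Splitter``, would be to
keep the fitted structure and overwrite the leaf values by hand (only a
sketch, and it assumes the private ``tree_.value`` buffer is writable, which
is not a public API and may change across versions):

    import numpy as np

    def refresh_leaf_values(forest, X, y):
        # Keep each tree's structure but replace the leaf predictions by the
        # mean of the new targets (1-d array y) falling into each leaf.
        X32 = np.asarray(X, dtype=np.float32)
        for est in forest.estimators_:
            leaves = est.tree_.apply(X32)
            for leaf in np.unique(leaves):
                mask = leaves == leaf
                est.tree_.value[leaf, 0, 0] = y[mask].mean()

But this feels fragile, hence my question.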

Is there an easier workaround?

Best,
Pierre-Luc