> - discuss with the tree growers how to best parallelize random forest
> training on multi-core machines without copying the training set in
> memory:
>    - either with threads in joblib and "with nogil" statements in the
> inner loops of the (new) Cython code,
>    - or with shared memory and the (in)famous PR #44 of joblib

Indeed, I would love to see this properly fixed in one way or another.
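For the threads + nogil route, here is a minimal sketch of the idea in
plain Python, using joblib's threading backend so that every worker sees
the same X/y arrays with no copy. It assumes the Cython tree builder
releases the GIL in its inner loops, and fit_one_tree plus the toy data
are just placeholders of mine, not anything that exists in the code base:

    import numpy as np
    from joblib import Parallel, delayed
    from sklearn.tree import DecisionTreeClassifier

    def fit_one_tree(X, y, seed):
        # Each thread fits its own tree on the *same* X and y buffers;
        # with the threading backend no copy of the training set is made.
        tree = DecisionTreeClassifier(random_state=seed)
        return tree.fit(X, y)

    rng = np.random.RandomState(0)
    X = rng.rand(10000, 20)
    y = (X[:, 0] > 0.5).astype(np.int32)

    # Threads instead of processes: the data stays shared in memory.
    # The speed-up only materializes if the Cython building loops run
    # under "with nogil", otherwise the threads serialize on the GIL.
    trees = Parallel(n_jobs=4, backend="threading")(
        delayed(fit_one_tree)(X, y, seed) for seed in range(10))

The shared-memory / memmap route of joblib PR #44 would be the
alternative if we stay with processes instead of threads.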

Otherwise, on my part, I plan to complete PR #2131 if it is not merged
by the time of the sprint, and then to address the tree-related
issues/PRs that have been lying around for months. Also, if anyone has a
special request for the ensemble/tree modules, now is the time to ask :)

Besides that, I plan to dedicate a good deal of my time to fixing or
closing random issues and reviewing pending PRs. The list has grown so
much lately that we need to cut it down!

And of course I'll help Nicolas with his GSoC.
