Hello all,

I just read the release announcement, congratulations! One new feature
that caught my attention was the regression trees/forests with support
for multiple outputs. Can someone point me to a reference (paper) that
this implementation was based on?

For a while in the past I experimented with the Multivariate random
forest which is described here:
http://onlinelibrary.wiley.com/doi/10.1002/widm.12/abstract.

Basically, the idea is that when choosing the feature to split on, the
algorithm accounts for the covariance matrix of the response
variables*. I would like to know if sklearn's implementation follows
the same approach.
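
Just to make the question concrete, here is a minimal sketch of how I
understand the new multi-output support is meant to be used (based only
on the usual scikit-learn estimator API; the exact split criterion is
what I am asking about):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.RandomState(0)
X = rng.uniform(-1, 1, size=(200, 5))
# Two correlated response variables built from the same feature,
# so a covariance-aware criterion could in principle exploit them.
Y = np.column_stack([np.sin(np.pi * X[:, 0]),
                     np.sin(np.pi * X[:, 0]) + 0.1 * X[:, 1]])

forest = RandomForestRegressor(n_estimators=50, random_state=0)
forest.fit(X, Y)          # Y has shape (n_samples, n_outputs)
pred = forest.predict(X[:5])
print(pred.shape)         # one column per output: (5, 2)
```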

*For example, the squared Mahalanobis distance could be used as the
regression criterion; with a single output it reduces to the ordinary
squared error.
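
To illustrate the footnote, here is a small sketch of the squared
Mahalanobis distance of a response vector from a node mean (my own
illustration, not code from either implementation), showing the
reduction to squared error in the one-output case:

```python
import numpy as np

def sq_mahalanobis(y, mean, cov):
    # Squared Mahalanobis distance: (y - mean)^T cov^{-1} (y - mean).
    d = y - mean
    return float(d @ np.linalg.inv(cov) @ d)

# With one output and unit variance, this is just the squared error:
y, mean, cov = np.array([3.0]), np.array([1.0]), np.array([[1.0]])
print(sq_mahalanobis(y, mean, cov))  # 4.0 == (3 - 1)**2
```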

--
Flavio

_______________________________________________
Scikit-learn-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
