Yep, I think that your solution would work, Olivier. I am busy this weekend
but I can push a first draft of this refactoring by the beginning of next
week.
Gilles
On Saturday, 21 January 2012, Olivier Grisel wrote:
> 2012/1/20 Andreas :
>> On 01/20/2012 11:07 PM, [email protected] wrote:
>>> I wonder if the Decision Tree base estimator could derive from a more
>>> general base estimator for Random Forests and just, for example, override a
>>> setup method or a constructor?
2012/1/20 Andreas :
> On 01/20/2012 11:07 PM, [email protected] wrote:
>> I wonder if the Decision Tree base estimator could derive from a more
>> general base estimator for Random Forests and just, for example, override a
>> setup method or a constructor?
>>
>>
> This seems like a very good idea.
On 01/20/2012 11:07 PM, [email protected] wrote:
> I wonder if the Decision Tree base estimator could derive from a more general
> base estimator for Random Forests and just, for example, override a setup
> method or a constructor?
>
>
This seems like a very good idea.
It's definitely better
I wonder if the Decision Tree base estimator could derive from a more general
base estimator for Random Forests and just, for example, override a setup
method or a constructor?
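
A rough sketch of the kind of hierarchy being proposed; the names here
(BaseAveragingEnsemble, _setup_fit, the X_argsorted fit keyword) are
illustrative, not the actual scikit-learn API:

    import numpy as np
    from sklearn.base import clone

    class BaseAveragingEnsemble(object):
        """Generic base class: fits several clones of a base estimator."""

        def __init__(self, base_estimator, n_estimators=10):
            self.base_estimator = base_estimator
            self.n_estimators = n_estimators

        def _setup_fit(self, X, y):
            # Hook for estimator-specific precomputation; no-op by default.
            return {}

        def fit(self, X, y):
            extra = self._setup_fit(X, y)
            self.estimators_ = [clone(self.base_estimator).fit(X, y, **extra)
                                for _ in range(self.n_estimators)]
            return self

    class BaseForest(BaseAveragingEnsemble):
        # The tree-specific subclass only overrides the setup hook, as
        # suggested above: presort the features once and share the result
        # with every tree via a hypothetical fit keyword.
        def _setup_fit(self, X, y):
            return {"X_argsorted": np.argsort(X, axis=0)}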
-----Original Message-----
From: Andreas
Date: Fri, 20 Jan 2012 22:53:40
To:
Reply-To: [email protected]
On 01/20/2012 10:45 PM, Gilles Louppe wrote:
> Yes indeed, as I said at the time, much of the forest code could be
> reused to implement a pure averaging meta-estimator.
>
> The main thing that makes BaseForest tree-specific is that it
> precomputes X_argsorted such that it is computed only once for all
> trees and injects it into the fit method of the base estimators.
Yes indeed, as I said at the time, much of the forest code could be
reused to implement a pure averaging meta-estimator.
The main thing that makes BaseForest tree-specific is that it
precomputes X_argsorted such that it is computed only once for all
trees and injects it into the fit method of the base estimators.
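
To see why that presorting is the reusable piece, here is a toy split search,
not the scikit-learn implementation, that walks one feature in the order cached
by X_argsorted; every node of every tree repeats a scan like this, so sorting
once up front pays off:

    import numpy as np

    def best_threshold(X, y, X_argsorted, feature):
        # Walk the samples of one feature in sorted order (the order that
        # X_argsorted caches) and keep the split with the lowest total
        # variance, i.e. a naive MSE criterion for regression trees.
        order = X_argsorted[:, feature]
        xs, ys = X[order, feature], y[order]
        best_t, best_score = None, np.inf
        for i in range(1, len(xs)):
            if xs[i] == xs[i - 1]:
                continue  # identical feature values cannot be separated
            left, right = ys[:i], ys[i:]
            score = len(left) * left.var() + len(right) * right.var()
            if score < best_score:
                best_t, best_score = 0.5 * (xs[i - 1] + xs[i]), score
        return best_t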
2012/1/20 Andreas :
> To what extent is #491 tree-specific?
> This is parallelization over different bootstrap samples.
> Or am I missing something there?
> Feature importance (#478) is not as generic but
> "just" relies on feature importance from the base classifier,
> right?
> Or did I miss something?
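
Neither point in that quote is actually tree-specific; a generic sketch could
look like this (fit_one, fit_bootstrap_ensemble and ensemble_feature_importances
are hypothetical helpers, while joblib's Parallel/delayed and sklearn.base.clone
are real APIs):

    import numpy as np
    from joblib import Parallel, delayed
    from sklearn.base import clone

    def fit_one(base_estimator, X, y, seed):
        # Fit one clone on one bootstrap sample; works for any estimator.
        rng = np.random.RandomState(seed)
        idx = rng.randint(0, X.shape[0], X.shape[0])
        return clone(base_estimator).fit(X[idx], y[idx])

    def fit_bootstrap_ensemble(base_estimator, X, y, n_estimators=10, n_jobs=2):
        # Parallelization over bootstrap samples, independent of the estimator.
        return Parallel(n_jobs=n_jobs)(
            delayed(fit_one)(base_estimator, X, y, seed)
            for seed in range(n_estimators))

    def ensemble_feature_importances(estimators):
        # Feature importance just delegates to whatever the fitted base
        # estimators expose, then averages it.
        return np.mean([e.feature_importances_ for e in estimators], axis=0)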
Digging up an old thread:
>> Also, I was wondering how tree-specific the random forest module is.
>> I looked at the pull request but could not find much about this.
>> Was there any discussion on this that I missed? What was the reasoning
>> behind having mixins instead of a meta-classifier for bagging?
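
The aggregation half of such a bagging meta-classifier is just as generic; a
minimal sketch, assuming integer class labels for the voting variant:

    import numpy as np

    def average_predict(estimators, X):
        # Pure averaging of the base estimators' outputs (regression-style).
        return np.mean([est.predict(X) for est in estimators], axis=0)

    def vote_predict(estimators, X):
        # Majority vote across base classifiers, assuming integer labels >= 0.
        preds = np.asarray([est.predict(X) for est in estimators]).astype(int)
        return np.apply_along_axis(
            lambda votes: np.bincount(votes).argmax(), 0, preds)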
Hi,
> http://metaoptimize.com/qa
This is a wonderful resource.
Thanks a lot!
On Sat, Jan 21, 2012 at 2:28 AM, Gael Varoquaux wrote:
> Each discussion on the scipy mailing list about matrices versus arrays raises
> a thread expressing incomprehension across different groups of users who
> either think that matrix objects are dangerous and to be avoided, or
> fundamental to expressing linear algebra.
On Thu, Jan 19, 2012 at 10:48:11AM -0500, Kenneth C. Arnold wrote:
> As an aside to those who use scipy's sparse matrices: do you find it
> troublesome that scipy's sparse things behave like matrices instead of
> like ndarrays?
Yes I do.
> Of course this should be brought up on a main scipy list,
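
The surprise in question is that * means matrix product for scipy.sparse
matrices but elementwise product for ndarrays, for example:

    import numpy as np
    import scipy.sparse as sp

    A = np.array([[1, 2], [3, 4]])
    S = sp.csr_matrix(A)

    print(A * A)               # ndarray semantics: elementwise product
    print((S * S).toarray())   # sparse matrix semantics: same as np.dot(A, A)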
On Fri, Jan 20, 2012 at 09:40:39PM +0530, Jaidev Deshpande wrote:
> I have a conceptual question about compressed sensing, and it has
> nothing to do with Python (yet!).
Hey Jaidev,
I think that you should ask this question on metaoptimize:
http://metaoptimize.com/qa
A lot of the theory-minded s
Hi List,
I have a conceptual question about compressed sensing, and it has
nothing to do with Python (yet!).
So I don't know if it is appropriate to ask this question on this
mailing list. Please excuse me.
Suppose I have an array X with N dimensions, and after a linear
transformation I get an a
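
The question above is cut off, but a minimal compressed-sensing toy along those
lines might look like the following; the sizes and the Lasso penalty are
arbitrary choices for illustration:

    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.RandomState(0)
    n, m, k = 200, 60, 5                      # signal length, measurements, nonzeros

    x = np.zeros(n)
    x[rng.permutation(n)[:k]] = rng.randn(k)  # a k-sparse signal
    A = rng.randn(m, n) / np.sqrt(m)          # random sensing matrix
    y = A.dot(x)                              # the "linear transformation"

    # L1-penalised least squares tries to recover x from m << n measurements.
    x_hat = Lasso(alpha=0.01, fit_intercept=False, max_iter=10000).fit(A, y).coef_
    print("true support:     ", np.sort(np.flatnonzero(x)))
    print("recovered support:", np.sort(np.flatnonzero(np.abs(x_hat) > 0.05)))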