It is if you use the gradient boosting variant. I'll work on it next week
on vacation...

Sent from my iPad

On May 24, 2011, at 4:48 PM, Ted Dunning <[email protected]> wrote:

> Is AdaBoost a scalable algorithm?
> 
> It seems to me that it is inherently very sequential.
> 
> On Tue, May 24, 2011 at 1:57 PM, Wojciech Indyk 
> <[email protected]>wrote:
> 
>> Hi!
>> I want to implement AdaBoost in Mahout. Would it be useful in Mahout? I
>> think so, because it's a strong and very powerful algorithm, but Mahout
>> has its own requirements, so who knows :)
>> I've thought about the training, and I know I must parallelize over the
>> data rather than over the algorithms, so it won't be easy: I would have
>> to run all the mappers of the chosen weak learners inside my training
>> mapper, but I have no idea how to make the weak learner selectable as a
>> parameter (architecturally).
>> 
>> Regards
>> 
