Sorry, I should have been clearer.  I was referring to using a user-based 
recommender (e.g., GenericUserBasedRecommender) vs. an item-based recommender. 
 Our general recommendation is that user-based approaches won't scale; I was 
wondering what the general cutoff is on a single machine, more or less.  Is it 
still 100M data points, roughly speaking?
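For context on why user-based approaches hit a wall, here is a toy, self-contained sketch (plain Java, not Mahout's actual implementation; all class and method names here are my own): the expensive step in user-based CF is comparing users against other users, and the number of user pairs grows quadratically with the user count, whether similarities are computed lazily per request or precomputed.

```java
import java.util.*;

// Toy sketch (NOT Mahout code) of the user-based CF bottleneck:
// similarity is defined between users, so covering all user pairs
// costs O(U^2) similarity computations for U users.
public class UserBasedSketch {

    // Cosine similarity over two sparse preference vectors (itemId -> rating).
    static double cosine(Map<Integer, Double> a, Map<Integer, Double> b) {
        double dot = 0, normA = 0, normB = 0;
        for (Map.Entry<Integer, Double> e : a.entrySet()) {
            Double other = b.get(e.getKey());
            if (other != null) dot += e.getValue() * other;
            normA += e.getValue() * e.getValue();
        }
        for (double v : b.values()) normB += v * v;
        return (normA == 0 || normB == 0)
                ? 0
                : dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }

    // Number of unordered user pairs: U * (U - 1) / 2, counted explicitly.
    static int countPairs(Set<Integer> users) {
        int pairs = 0;
        for (int u : users)
            for (int v : users)
                if (u < v) pairs++;
        return pairs;
    }

    public static void main(String[] args) {
        // Tiny fixture: userId -> (itemId -> rating).
        Map<Integer, Map<Integer, Double>> prefs = new HashMap<>();
        prefs.put(1, Map.of(10, 5.0, 11, 3.0));
        prefs.put(2, Map.of(10, 4.0, 11, 3.0));
        prefs.put(3, Map.of(12, 2.0));

        // The quadratic part: every pair of users gets a similarity.
        for (int u : prefs.keySet())
            for (int v : prefs.keySet())
                if (u < v) cosine(prefs.get(u), prefs.get(v));

        System.out.println("similarity pairs computed: "
                + countPairs(prefs.keySet()));
    }
}
```

With U users the pair count is U(U-1)/2, so 1M users already implies roughly 5×10^11 similarity computations, which is why the practical single-machine limit is stated in data points rather than algorithmic elegance.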


On Oct 26, 2011, at 8:57 AM, Sean Owen wrote:

> Limits in terms of scalability? If you mean, how much can you fit on
> one machine without Hadoop, I usually say 100M data points or so.
> Beyond that you can go as big as you like, but on Hadoop.
> 
> On Wed, Oct 26, 2011 at 1:56 PM, Grant Ingersoll <[email protected]> wrote:
>> I seem to recall past discussions on where one hits the bottleneck w/ 
>> user-based recommendation approaches in Mahout, but I can't seem to locate 
>> them anymore.  Anyone know offhand?  Where do user-based approaches hit 
>> their limits, more or less?
>> 
>> Thanks,
>> Grant
>> 
