Thanks. In the end, I got away with 7.3GB; I guess this is as good as it gets for now.

On 18/06/2010 14:52, Sean Owen wrote:
GenericDataModel (GDM) needs roughly 28 bytes per preference, all told. I'd expect it
alone to consume "just" 2.8GB of heap on the Netflix data set. The rest
may be consumed by SVD matrices, your application's own storage, and other
JVM overhead.
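(Roughly: the Netflix Prize training set has about 100 million preferences, and
100,000,000 x 28 bytes is about 2.8GB, which is where that figure comes from.)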

10GB seems very large. Are you setting the new generation size to be
relatively small? Otherwise the defaults will probably waste a couple of
gigabytes of heap.
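For example, something like this (the heap sizes are only illustrative, and the
jar and class names are placeholders, not real Mahout artifacts):

  java -Xmx8g -Xmn512m -cp my-recommender.jar MyRecommenderRunner

-Xmn caps the young generation at 512MB, so nearly all of the heap is left to the
old generation, where the long-lived DataModel and factor matrices end up after
promotion.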

Do what works best for you though.

On Fri, Jun 18, 2010 at 2:41 PM, Tamas Jambor <jambo...@gmail.com> wrote:
I mean I used to run SVD with my own implementation, storing only three arrays
(user, item, rating), which I was able to fit in 3GB of memory. I guess
GenericDataModel takes up quite a lot of memory, because the data is indexed
both by user and by item, which is not necessary for SVD.
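Roughly what I mean by three arrays is something like this (just a sketch, not
Mahout code; the class and field names are made up):

  // One slot per rating; no per-user or per-item index is kept,
  // which is all an iterative SVD factorizer needs to sweep over the data.
  final class FlatRatings {
    final int[] userIDs;   // userIDs[i]: user of rating i
    final int[] itemIDs;   // itemIDs[i]: item of rating i
    final float[] values;  // values[i]: rating value
    FlatRatings(int numRatings) {
      userIDs = new int[numRatings];
      itemIDs = new int[numRatings];
      values = new float[numRatings];
    }
  }

At about 12 bytes per rating, that is roughly 1.2GB for 100 million ratings,
before any indexing by user or by item.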

On 18/06/2010 14:21, Sean Owen wrote:
Memory requirements may be much higher for this algorithm as it builds
large intermediate data structures to compute the SVD. Yes, I think the
simple data fits in 3GB or so. Sounds like you have solved your
problem by supplying more memory.

