I’m trying to run Mahout 0.10 with Spark 1.1.1.
I have input files of sizes 8k, 10M, 20M, and 25M.

So far I have run with the following configurations:

8k with 1, 2, 3 slaves
10M with 1, 2, 3 slaves
20M with 1, 2, 3 slaves

But when I try to run

bin/mahout spark-itemsimilarity --master spark://node1:7077 \
  --input filein.txt --output out --sparkExecutorMem 6g

with the 25M file, I get this error:

java.lang.OutOfMemoryError: Java heap space

or

java.lang.OutOfMemoryError: GC overhead limit exceeded


Is that normal? The 20M run finished without any error, and this input is
only 5M larger.

Any ideas why this is happening?
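In case it is useful, this is the kind of next attempt I had in mind (just a
sketch: the 8g value is a guess, and I am assuming that bin/mahout passes
MAHOUT_OPTS through to the driver JVM, so that the driver heap can be raised
separately from the executors):

```shell
# Assumption: the OutOfMemoryError may come from the driver JVM rather than
# the executors, so give the driver more heap via MAHOUT_OPTS (if bin/mahout
# honors it) in addition to raising --sparkExecutorMem.
export MAHOUT_OPTS="-Xmx4g"

bin/mahout spark-itemsimilarity \
  --master spark://node1:7077 \
  --input filein.txt \
  --output out \
  --sparkExecutorMem 8g
```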

-- 
Rodolfo de Lima Viana
Undergraduate in Computer Science at UFCG
