Thanks, Abel.

Best,
Yifan LI
On Jul 21, 2014, at 4:16 PM, Abel Coronado Iruegas <acoronadoirue...@gmail.com> 
wrote:

> Hi Yifan
> 
> This works for me:
> 
> export SPARK_JAVA_OPTS="-Xms10g -Xmx40g -XX:MaxPermSize=10g"
> export ADD_JARS=/home/abel/spark/MLI/target/MLI-assembly-1.0.jar
> export SPARK_MEM=40g
> ./spark-shell 
> 
> 
> Regards
> 
> 
> On Mon, Jul 21, 2014 at 7:48 AM, Yifan LI <iamyifa...@gmail.com> wrote:
> Hi,
> 
> I am trying to load the GraphX example dataset (LiveJournal, 1.08 GB) through 
> the Scala shell on my standalone multicore machine (8 CPUs, 16 GB memory), but an 
> OutOfMemoryError was thrown when running the code below:
> 
> val graph = GraphLoader.edgeListFile(sc, path, minEdgePartitions = 
> 16).partitionBy(PartitionStrategy.RandomVertexCut)
> 
> I guess I should pass some parameters to the JVM, like "-Xmx5120m".
> But how do I do this in the Scala shell? 
> I started Spark directly with "bin/spark-shell", and everything seems to 
> work correctly in the WebUI.
> 
> Or should I set these parameters somewhere else (spark-1.0.1)?
> 
> 
> 
> Best,
> Yifan LI
> 
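For readers on Spark 1.0.x: SPARK_MEM is deprecated in that release line, and since spark-shell forwards spark-submit options there, the driver heap can also be set at launch time. A minimal sketch (the 8g size and paths are illustrative, not from the thread):

```shell
# Option 1: pass the driver heap size as a spark-submit flag
# (spark-shell forwards these options in Spark 1.0+)
./bin/spark-shell --driver-memory 8g

# Option 2: set the environment variable before launching the shell
export SPARK_DRIVER_MEMORY=8g
./bin/spark-shell

# Extra jars (e.g. an assembly) can be added the same way:
# ./bin/spark-shell --driver-memory 8g --jars /path/to/assembly.jar
```

Either form sizes the JVM running the shell itself, which is what matters here since GraphLoader.edgeListFile builds the graph in that process on a standalone single-machine setup.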
