Re: Spark JVM default memory

2015-05-04 Thread Vijayasarathy Kannan
[mailto:kvi...@vt.edu] > Sent: Monday, May 4, 2015 3:36 PM > To: Andrew Ash > Cc: user@spark.apache.org > Subject: Re: Spark JVM default memory > I am trying to read in a file (4GB file). I tried setting both "spark.driver.memory" and "spark.executor.memory" ...

RE: Spark JVM default memory

2015-05-04 Thread Mohammed Guller
Did you confirm through the Spark UI how much memory is getting allocated to your application on each worker? Mohammed > From: Vijayasarathy Kannan [mailto:kvi...@vt.edu] > Sent: Monday, May 4, 2015 3:36 PM > To: Andrew Ash > Cc: user@spark.apache.org > Subject: Re: Spark JVM default memory > I am trying ...
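
As context for checking this: the standalone master web UI and the per-application driver UI (default ports 8080 and 4040 respectively) both report the memory granted to executors. The hostnames below are placeholders:

    http://<master-host>:8080   (master UI: workers and memory per running application)
    http://<driver-host>:4040   (driver UI: the Executors tab shows memory per executor)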

Re: Spark JVM default memory

2015-05-04 Thread Vijayasarathy Kannan
I am trying to read in a file (4GB file). I tried setting both "spark.driver.memory" and "spark.executor.memory" to large values (say 16GB) but I still get a GC limit exceeded error. Any idea what I am missing? On Mon, May 4, 2015 at 5:30 PM, Andrew Ash wrote: > It's unlikely you need to increase ...
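
For context on the settings mentioned above: spark.driver.memory only takes effect if it is set before the driver JVM starts, so it is usually passed on the spark-submit command line (or in conf/spark-defaults.conf) rather than in SparkConf inside the application. A minimal sketch, where the class name, master URL, jar, and input path are made-up placeholders:

    # values below are illustrative only
    ./bin/spark-submit \
      --class com.example.ProcessFile \
      --master spark://<master-host>:7077 \
      --driver-memory 16g \
      --executor-memory 16g \
      my-app.jar /path/to/4gb-input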

Re: Spark JVM default memory

2015-05-04 Thread Andrew Ash
It's unlikely you need to increase the amount of memory on your master node since it does simple bookkeeping. The majority of the memory pressure across a cluster is on executor nodes. See the conf/spark-env.sh file for configuring heap sizes, and this section in the docs for more information on ...
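
A sketch of the sort of conf/spark-env.sh entry being referred to for executor-side memory (the value is illustrative; the file is typically created by copying conf/spark-env.sh.template, and workers need a restart to pick it up):

    # Total memory a worker machine may hand out to executors on that node;
    # the per-application executor heap is still set by spark.executor.memory.
    SPARK_WORKER_MEMORY=32g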

Spark JVM default memory

2015-05-04 Thread Vijayasarathy Kannan
Starting the master with "/sbin/start-master.sh" creates a JVM with only 512MB of memory. How do I change this default amount of memory? Thanks, Vijay
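
For the master daemon itself, the usual knob is SPARK_DAEMON_MEMORY in conf/spark-env.sh, which covers the standalone master and worker daemons started by the sbin scripts. A minimal sketch (the 2g value is only an example), followed by a restart so it takes effect:

    # conf/spark-env.sh
    SPARK_DAEMON_MEMORY=2g

    # restart the master
    ./sbin/stop-master.sh
    ./sbin/start-master.sh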