Re: Setting only master heap

2014-10-26 Thread Keith Simmons
Hi Guys, here are some lines from the log file before the OOM. They don't look that helpful, so let me know if there's anything else I should be sending. I am running in standalone mode.

spark-pulse-org.apache.spark.deploy.master.Master-1-hadoop10.pulse.io.out.5:java.lang.OutOfMemoryError: Java heap space
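One way to get more than a one-line error out of the daemons is to ask the JVM for a heap dump on OOM. A minimal sketch, assuming the standard SPARK_DAEMON_JAVA_OPTS hook in conf/spark-env.sh; the dump path is a placeholder:

    # conf/spark-env.sh -- dump the heap when a daemon JVM hits an OOM
    export SPARK_DAEMON_JAVA_OPTS="-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/spark-master.hprof"

Note that SPARK_DAEMON_JAVA_OPTS applies to the workers as well as the master, so this is a diagnostic aid rather than a fix.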

Re: Setting only master heap

2014-10-23 Thread Nan Zhu
Hmm… my observation is that the master in Spark 1.1 GCs noticeably more often. Also, before 1.1 I never saw the Master spend excessive time in GC; after upgrading to 1.1 I have hit that twice (we upgraded soon after the 1.1 release). Best, -- Nan Zhu On Thursday, October 23, 2014 at 1:08 PM, Andrew Or wrote:
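A minimal sketch of how one could verify the GC-frequency difference Nan describes, assuming the standard SPARK_DAEMON_JAVA_OPTS hook and the usual HotSpot GC-logging flags of that era; the log path is a placeholder:

    # conf/spark-env.sh -- log GC activity for the standalone daemons
    export SPARK_DAEMON_JAVA_OPTS="-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Xloggc:/tmp/spark-daemon-gc.log"

Comparing these logs before and after the 1.1 upgrade would show whether the Master really is collecting more often.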

Re: Setting only master heap

2014-10-23 Thread Andrew Or
Yeah, as Sameer commented, there is unfortunately no equivalent `SPARK_MASTER_MEMORY` that you can set. You can work around this by starting the master and the slaves separately, with a different setting of SPARK_DAEMON_MEMORY each time. AFAIK there haven't been any major changes in the standalone Master.
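A minimal sketch of that workaround, assuming the standard sbin launch scripts and that SPARK_DAEMON_MEMORY isn't also set in conf/spark-env.sh (which would override the inline values); the heap sizes are placeholders:

    # Start the master with a larger heap...
    SPARK_DAEMON_MEMORY=2g ./sbin/start-master.sh

    # ...then start the workers with the default-sized daemon heap.
    SPARK_DAEMON_MEMORY=512m ./sbin/start-slaves.sh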

Re: Setting only master heap

2014-10-22 Thread Sameer Farooqui
Hi Keith, it would be helpful if you could post the error message. Are you running Spark in Standalone mode or with YARN? In general, the Spark Master is only used for scheduling, and it should be fine with the default setting of 512 MB RAM. Is it actually the Spark Driver's memory that you intended to increase?
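For contrast, a sketch of how the driver's heap (as opposed to the Master's) is typically raised via spark-submit; the application class and jar here are hypothetical:

    # Raise the driver heap, not the standalone Master's heap
    ./bin/spark-submit --driver-memory 4g --class com.example.MyApp myapp.jar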

Setting only master heap

2014-10-22 Thread Keith Simmons
We've been getting some OOMs from the Spark master since upgrading to Spark 1.1.0. I've found SPARK_DAEMON_MEMORY, but that also seems to increase the worker heap, which as far as I know is fine as it is. Is there any setting which *only* increases the master heap size? Keith
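For reference, a sketch of where that setting normally lives, assuming the usual conf/spark-env.sh; as Keith notes, it sizes the heap of both standalone daemons, which is exactly the problem:

    # conf/spark-env.sh -- raises the heap of BOTH the Master and the Workers
    export SPARK_DAEMON_MEMORY=2g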