Re: spark setting maximum available memory

2014-05-22 Thread Mayur Rustagi
Ideally you should use less. Around 75% would be a good target, which leaves enough room for shuffle-write scratch space and system processes.

Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi
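[A minimal Scala sketch of that 75% rule of thumb, assuming an 8 GB worker as in this thread; the app name and variable names are placeholders, not from the original emails:]

    import org.apache.spark.SparkConf

    // Rule of thumb from above: give executors ~75% of machine memory,
    // leaving the rest for shuffle-write scratch space and system processes.
    val machineMemGb = 8                                 // assumed worker RAM, as in this thread
    val executorMemGb = (machineMemGb * 0.75).toInt      // 6 on an 8 GB worker

    val conf = new SparkConf()
      .setAppName("memory-example")                      // placeholder app name
      .set("spark.executor.memory", s"${executorMemGb}g")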

Re: spark setting maximum available memory

2014-05-22 Thread Andrew Or
Hi Ibrahim,

If your worker machines only have 8GB of memory, then launching executors with all of it leaves no room for system processes. There is no strict guideline, but I usually leave around 1GB free just to be safe, so:

conf.set("spark.executor.memory", "7g")

Andrew
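[For context, a minimal sketch of applying that suggestion when building the context; the app name is a placeholder and only the spark.executor.memory value comes from the thread:]

    import org.apache.spark.{SparkConf, SparkContext}

    // 8 GB worker minus ~1 GB of headroom for the OS and daemons.
    val conf = new SparkConf()
      .setAppName("my-app")                    // placeholder name
      .set("spark.executor.memory", "7g")
    val sc = new SparkContext(conf)

[The same value can also be passed on the command line via spark-submit's --executor-memory 7g flag.]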

spark setting maximum available memory

2014-05-22 Thread İbrahim Rıza HALLAÇ
In my situation each slave has 8 GB of memory, and I want to use as much of it as I can:

.set("spark.executor.memory", "?g")

How can I determine the amount of memory I should set? It fails when I set it to 8GB.