On Mon, Jan 19, 2015 at 6:29 AM, Akhil Das <ak...@sigmoidanalytics.com> wrote:
> Its the executor memory (spark.executor.memory) which you can set while
> creating the spark context. By default it uses 0.6% of the executor memory
(Correction: it uses 0.6 as a fraction, i.e. 60% of the executor memory, not 0.6%; that fraction is controlled by spark.storage.memoryFraction, which defaults to 0.6.)
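
For reference, a minimal sketch of setting both values, assuming the Spark 1.x API that was current at the time of this thread (the app name and memory size below are placeholder values):

    import org.apache.spark.{SparkConf, SparkContext}

    // Executor memory must be set before the SparkContext is created;
    // it cannot be changed on a running context.
    val conf = new SparkConf()
      .setAppName("example")                       // placeholder app name
      .set("spark.executor.memory", "4g")          // total heap per executor (placeholder size)
      .set("spark.storage.memoryFraction", "0.6")  // fraction of that heap used for cached RDDs; 0.6 (60%) is the default

    val sc = new SparkContext(conf)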