On Mon, Jan 19, 2015 at 6:29 AM, Akhil Das <[email protected]> wrote:
> It's the executor memory (spark.executor.memory), which you can set while
> creating the Spark context. By default it uses 0.6% of the executor memory

(Do you mean it uses 0.6 of the executor memory, i.e., 60%, rather than 0.6%?)
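
For reference, a minimal sketch of setting both options when creating the
SparkContext, using the pre-2.0 SparkConf API that was current at the time of
this thread. The app name and the "4g" memory value are placeholders;
spark.storage.memoryFraction defaults to 0.6, i.e. 60% of the executor heap
reserved for cached RDDs:

    import org.apache.spark.{SparkConf, SparkContext}

    // spark.executor.memory sets the per-executor heap size ("4g" is an
    // example value). spark.storage.memoryFraction (default 0.6, i.e. 60%
    // of that heap) controls how much is reserved for RDD caching.
    val conf = new SparkConf()
      .setAppName("memory-config-example")        // placeholder app name
      .set("spark.executor.memory", "4g")         // example value
      .set("spark.storage.memoryFraction", "0.6") // the 0.6 (= 60%) default

    val sc = new SparkContext(conf)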
