You should use:

spark.executor.memory

from the docs <https://spark.apache.org/docs/latest/configuration.html>:
spark.executor.memory (default: 512m)
Amount of memory to use per executor process, in the same format as JVM
memory strings (e.g. 512m, 2g).
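
Something like this should work (a minimal sketch; the app name and the
8g value are placeholders, and I'm assuming the app is launched through
spark-submit, which supplies the master):

  import org.apache.spark.{SparkConf, SparkContext}

  val conf = new SparkConf()
    .setAppName("heap-sizing-example")      // placeholder app name
    .set("spark.executor.memory", "8g")     // heap per executor
  val sc = new SparkContext(conf)

As far as I can tell, Spark launches each executor JVM with both -Xms
and -Xmx set to this value, so there is no separate start/max knob. The
same setting can also be passed on the command line with spark-submit's
--executor-memory 8g flag.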

-Todd



On Thu, Jul 2, 2015 at 3:36 PM, Mulugeta Mammo <mulugeta.abe...@gmail.com>
wrote:

> tried that one and it throws an error - extraJavaOptions is not allowed
> to alter memory settings, use spark.executor.memory instead.
>
> On Thu, Jul 2, 2015 at 12:21 PM, Benjamin Fradet <
> benjamin.fra...@gmail.com> wrote:
>
>> Hi,
>>
>> You can set those parameters through the
>>
>> spark.executor.extraJavaOptions
>>
>> Which is documented in the configuration guide:
>> spark.apache.org/docs/latest/configuration.html
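>>
>> For example, to pass extra JVM flags to the executors (a sketch; the
>> GC-logging flag is just an illustration, any JVM option string can go
>> here):
>>
>>   val conf = new SparkConf()
>>     .set("spark.executor.extraJavaOptions", "-XX:+PrintGCDetails")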
>> On 2 Jul 2015 9:06 pm, "Mulugeta Mammo" <mulugeta.abe...@gmail.com>
>> wrote:
>>
>>> Hi,
>>>
>>> I'm running Spark 1.4.0. I want to specify the start and max size
>>> (-Xms and -Xmx) of the JVM heap for my executors. I tried:
>>>
>>> executor.cores.memory="-Xms1g -Xmx8g"
>>>
>>> but it doesn't work. How do I specify them?
>>>
>>> Appreciate your help.
>>>
>>> Thanks,
>>>
>>>
>
