In Spark 1.0+ you can just pass the --executor-memory flag to ./bin/spark-shell.
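
For example (the 2g value is only an illustration; use whatever your nodes can spare):

    ./bin/spark-shell --executor-memory 2g

Once the shell is up, sc.getConf.get("spark.executor.memory") should report the value you passed.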

On Fri, Jun 6, 2014 at 12:32 AM, Oleg Proudnikov
<oleg.proudni...@gmail.com> wrote:
> Thank you, Hassan!
>
>
> On 6 June 2014 03:23, hassan <hellfire...@gmail.com> wrote:
>>
>> just use -Dspark.executor.memory=
>
> --
> Kind regards,
>
> Oleg
>
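
For anyone still on a pre-1.0 build, where the --executor-memory flag is not available: the -Dspark.executor.memory property suggested above was typically supplied through the JVM options environment variable, for example (the variable name and the 2g value are illustrative; check the docs for your exact version):

    SPARK_JAVA_OPTS="-Dspark.executor.memory=2g" ./bin/spark-shell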
