Have you tried passing --executor-cores or --total-executor-cores as
arguments, depending on the Spark version?

From: kant kodali [mailto:kanth...@gmail.com]
Sent: Friday, February 17, 2017 5:03 PM
To: Alex Kozlov <ale...@gmail.com>
Cc: user @spark <user@spark.apache.org>
Subject: Re: question on SPARK_WORKER_CORES
Standalone.
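For a standalone cluster like this, the suggested flags are passed at submit time. A minimal sketch — the master URL, core counts, and jar name below are hypothetical placeholders, not values from this thread:

```shell
# Illustrative spark-submit against a hypothetical standalone master.
# --total-executor-cores caps cores across all executors of the app
#   (standalone/Mesos); --executor-cores sets cores per executor.
spark-submit \
  --master spark://master:7077 \
  --total-executor-cores 16 \
  --executor-cores 4 \
  app.jar
```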
On Fri, Feb 17, 2017 at 5:01 PM, Alex Kozlov wrote:

> What Spark mode are you running the program in?
>
> On Fri, Feb 17, 2017 at 4:55 PM, kant kodali wrote:
>
>> when I submit a job using spark shell I get something like this
>>
>> [Stage 0:>(36814 + 4) / 220129]
>>
>> Now all I want is I want to increase number of parallel
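Reading the progress bar above: the first number is tasks completed, the middle number is tasks currently running, and the last is the stage's total task count, so "(36814 + 4)" suggests only 4 tasks run concurrently, i.e. only 4 cores are allocated to the app. A sketch of granting more cores when launching the shell — the master URL and the value 32 are hypothetical:

```shell
# Illustrative: ask the standalone master for up to 32 cores total,
# allowing up to 32 tasks of a stage to run at once.
spark-shell --master spark://master:7077 --total-executor-cores 32
```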
It’s actually not that tricky.

SPARK_WORKER_CORES is the maximum size of the executor’s task thread pool; in other words, “one executor with 32 cores could execute 32 tasks simultaneously”. Spark doesn’t care how many real physical CPUs/cores you have (the OS does), so
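A minimal sketch of where this is typically set — in conf/spark-env.sh on each worker machine; the value 32 is illustrative, not from this thread:

```shell
# conf/spark-env.sh (illustrative). SPARK_WORKER_CORES is the number of
# cores this worker offers to executors; per the point above, it only
# sizes the task thread pool, so it may even exceed the physical core
# count, with the OS multiplexing the threads onto real CPUs.
export SPARK_WORKER_CORES=32
```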