Re: question on SPARK_WORKER_CORES

2017-02-18 Thread Yan Facai
> Have you tried passing --executor-cores or --total-executor-cores as arguments, depending on the Spark version?
>
> *From:* kant kodali [mailto:kanth...@gmail.com]
> *Sent:* Friday, February 17, 2017

Re: question on SPARK_WORKER_CORES

2017-02-17 Thread kant kodali

Re: question on SPARK_WORKER_CORES

2017-02-17 Thread Alex Kozlov

Re: question on SPARK_WORKER_CORES

2017-02-17 Thread kant kodali

RE: question on SPARK_WORKER_CORES

2017-02-17 Thread Satish Lalam
Have you tried passing --executor-cores or --total-executor-cores as arguments, depending on the Spark version?

From: kant kodali [mailto:kanth...@gmail.com]
Sent: Friday, February 17, 2017 5:03 PM
To: Alex Kozlov <ale...@gmail.com>
Cc: user @spark <user@spark.apache.org>
Subject:
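For reference, a minimal sketch of how these flags are passed on the command line; the master URL, class name, and jar name below are placeholders, not taken from this thread:

```shell
# Standalone mode sketch (placeholder names):
# --total-executor-cores caps the cores the app takes across the whole cluster;
# --executor-cores sets the cores given to each individual executor.
spark-submit \
  --master spark://master-host:7077 \
  --total-executor-cores 32 \
  --executor-cores 4 \
  --class com.example.MyApp \
  my-app.jar
```

With the values above the scheduler would launch up to 8 executors of 4 cores each, giving 32 concurrently running tasks.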

Re: question on SPARK_WORKER_CORES

2017-02-17 Thread kant kodali
Standalone.

On Fri, Feb 17, 2017 at 5:01 PM, Alex Kozlov wrote:
> What Spark mode are you running the program in?
>
> On Fri, Feb 17, 2017 at 4:55 PM, kant kodali wrote:
>> When I submit a job using spark shell I get something like this:
>>
>> [Stage 0:> (36814 + 4) / 220129]
>>
>> Now all I want is to increase the number of parallel

Re: question on SPARK_WORKER_CORES

2017-02-17 Thread Alex Kozlov
What Spark mode are you running the program in?

On Fri, Feb 17, 2017 at 4:55 PM, kant kodali wrote:
> When I submit a job using spark shell I get something like this:
>
> [Stage 0:> (36814 + 4) / 220129]
>
> Now all I want is to increase the number of parallel

RE: Question about SPARK_WORKER_CORES and spark.task.cpus

2015-06-22 Thread Cheng, Hao
It’s actually not that tricky. SPARK_WORKER_CORES is the max task thread pool size of the executor; this is the same as saying “one executor with 32 cores can execute 32 tasks simultaneously”. Spark doesn’t care how many real physical CPUs/cores you have (the OS does), so
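As a sketch of the setting being described (the value 32 is illustrative): in standalone mode the advertised slot count is configured per worker in conf/spark-env.sh, and, as noted above, it need not match the physical core count:

```shell
# conf/spark-env.sh on each standalone worker (illustrative value).
# Advertise 32 task slots; Spark does not check this against physical cores.
export SPARK_WORKER_CORES=32

# Concurrent tasks per executor = executor cores / spark.task.cpus,
# so with 32 cores and the default spark.task.cpus=1, up to 32 tasks
# run at once; setting spark.task.cpus=2 would halve that to 16.
```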