I don't think you can do that in Standalone mode before 1.5.
The best you can do is to run multiple workers per box; before Spark 1.5, one worker can and will only start one executor.
What you can do is set "SPARK_WORKER_INSTANCES", which controls how many worker instances are started per box.
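For example, something along these lines in conf/spark-env.sh on each box (a sketch only; the concrete values are assumptions based on the 12GB / 4-core machines described below):

    # conf/spark-env.sh on every box -- sketch only, values are assumptions
    export SPARK_WORKER_INSTANCES=4   # start 4 worker daemons per box
    export SPARK_WORKER_CORES=1       # give each worker 1 of the 4 cores
    export SPARK_WORKER_MEMORY=3g     # assumed even split of the 12GB per box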
Yong 

Date: Mon, 28 Sep 2015 20:47:18 -0700
Subject: Re: Setting executors per worker - Standalone
From: james.p...@gmail.com
To: zjf...@gmail.com
CC: user@spark.apache.org

Thanks for your reply.
Setting it as
--conf spark.executor.cores=1
when I start spark-shell (as an example application) indeed sets the number of cores per executor to 1 (it was 4 before), but I still have 1 executor per worker. What I am really looking for is having 1 worker with 4 executors (each with one core) per machine when I run my application. Based on the documentation it seems to be feasible, but it is not clear how.
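(For reference, a sketch of that invocation; the master URL is a placeholder:)

    ./bin/spark-shell --master spark://<master-host>:7077 --conf spark.executor.cores=1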
Thanks.
On Mon, Sep 28, 2015 at 8:46 PM, Jeff Zhang <zjf...@gmail.com> wrote:
use "--executor-cores 1" you will get 4 executors per worker since you have 4 
cores per worker
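A minimal sketch of that, with a placeholder master URL; the per-executor memory also has to be small enough that four executors fit under SPARK_WORKER_MEMORY (the 2g figure here is just an assumption):

    ./bin/spark-shell --master spark://<master-host>:7077 \
        --executor-cores 1 \
        --executor-memory 2g   # assumed value; 4 x 2g must fit in SPARK_WORKER_MEMORY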


On Tue, Sep 29, 2015 at 8:24 AM, James Pirz <james.p...@gmail.com> wrote:
Hi,
I am using Spark 1.5 (standalone mode) on a cluster with 10 nodes; each machine has 12GB of RAM and 4 cores. On each machine I have one worker, which runs one executor that grabs all 4 cores. I am interested in checking the performance with "one worker but 4 executors per machine - each with one core".
I can see that "running multiple executors per worker in Standalone mode" is 
possible based on the closed issue:
https://issues.apache.org/jira/browse/SPARK-1706

But I cannot find a way to do that. "SPARK_EXECUTOR_INSTANCES" is only available for YARN mode, and in standalone mode I can only set "SPARK_WORKER_INSTANCES", "SPARK_WORKER_CORES", and "SPARK_WORKER_MEMORY".
Any hint or suggestion would be great.



-- 
Best Regards

Jeff Zhang

