RE: spark standalone with multiple executors in one work node

2015-03-05 Thread Judy Nash
[Quoting Sean Owen's reply of Thursday, February 26, 2015:] --num-executors is the total number of executors. In YARN there is not quite the same notion of a Spark worker. [...]

Re: spark standalone with multiple executors in one work node

2015-02-26 Thread Sean Owen
--num-executors is the total number of executors. In YARN there is not quite the same notion of a Spark worker. Of course, one worker has an executor for each running app, so yes; but do you mean for one app? It is possible, though not usual, to run multiple executors for one app on one worker. This ma [...]
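As a sketch of the standalone-mode arrangement Sean describes: in Spark 1.x, one common way to get more than one executor per machine for a single application is to launch multiple worker instances on each node via `conf/spark-env.sh` (the resource values below are illustrative, not from the thread):

```shell
# conf/spark-env.sh on each slave node (illustrative values).
# Launch two worker instances on this machine; each worker can then
# host one executor for a given application, yielding two executors
# per node for that app.
export SPARK_WORKER_INSTANCES=2
# Split the machine's resources between the two workers.
export SPARK_WORKER_CORES=4
export SPARK_WORKER_MEMORY=8g
```

After restarting the workers, both instances register with the standalone master and show up separately in its web UI.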

Re: spark standalone with multiple executors in one work node

2015-02-25 Thread bit1...@163.com
My understanding is that if you run multiple applications on the worker node, then each application will have its own ExecutorBackend process and an executor as well. bit1...@163.com
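To illustrate the point above: two applications submitted concurrently against the same standalone master each get their own executor (a CoarseGrainedExecutorBackend process) on the worker. Capping each app's cores with `spark.cores.max` lets both fit on one node. The master URL, jar names, and sizes here are illustrative:

```shell
# Two apps sharing one standalone worker (illustrative values).
# Each spark-submit spawns a separate executor backend process on
# the worker; spark.cores.max limits how many cores each app takes.
spark-submit --master spark://master:7077 \
  --conf spark.cores.max=2 --executor-memory 2g app-a.jar &
spark-submit --master spark://master:7077 \
  --conf spark.cores.max=2 --executor-memory 2g app-b.jar &
```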

spark standalone with multiple executors in one work node

2015-02-25 Thread Judy Nash
Hello, Does Spark standalone support running multiple executors on one worker node? It seems YARN has the parameter --num-executors to set the number of executors to deploy, but I do not find an equivalent parameter in Spark standalone. Thanks, Judy
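For reference, the YARN flag Judy mentions is passed to spark-submit; the executor counts and sizes below are illustrative:

```shell
# On YARN, --num-executors sets the total number of executors for
# the application across the whole cluster (there is no per-worker
# notion as in standalone mode).
spark-submit --master yarn \
  --num-executors 4 \
  --executor-cores 2 \
  --executor-memory 2g \
  app.jar
```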