[mailto:so...@cloudera.com]
Sent: Thursday, February 26, 2015 2:11 AM
To: Judy Nash
Cc: user@spark.apache.org
Subject: Re: spark standalone with multiple executors in one work node
--num-executors is the total number of executors. In YARN, there is not quite
the same notion of a Spark worker.
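As a sketch of one way to approximate this in standalone mode (assuming Spark 1.x standalone semantics, where an application gets at most one executor per worker), spark-env.sh can launch multiple worker daemons on each node via SPARK_WORKER_INSTANCES; the resource values below are illustrative, not prescribed:

```shell
# conf/spark-env.sh on each node (standalone mode)
# Run two worker daemons per machine instead of one, so a single
# application can be granted two executors on the same node.
SPARK_WORKER_INSTANCES=2
# Split the machine's resources across the worker instances
# (example values for an 8-core, 16 GB node).
SPARK_WORKER_CORES=4
SPARK_WORKER_MEMORY=7g
```

Each worker instance registers with the master separately, so the per-worker cores and memory should be sized to divide the node's capacity.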
Hello,
Does spark standalone support running multiple executors in one worker node?
It seems YARN has the parameter --num-executors to set the number of executors
to deploy, but I cannot find an equivalent parameter in Spark standalone.
Thanks,
Judy