Hi,

I don't know much about your particular use case, but most (if not all) of
the Spark command-line parameters can also be specified as properties.
You should try using

SparkLauncher.setConf("spark.executor.instances", "3")
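A minimal sketch of what that could look like end to end (the jar path, main class, and memory setting are placeholders, not from your setup):

```java
import org.apache.spark.launcher.SparkLauncher;

public class LaunchOnYarn {
    public static void main(String[] args) throws Exception {
        Process spark = new SparkLauncher()
            .setAppResource("/path/to/your-app.jar")    // placeholder: your application jar
            .setMainClass("com.example.YourApp")        // placeholder: your main class
            .setMaster("yarn-cluster")
            // equivalent of the --num-executors 3 command-line flag
            .setConf("spark.executor.instances", "3")
            // other flags work the same way, e.g. --executor-memory
            .setConf(SparkLauncher.EXECUTOR_MEMORY, "2g")
            .launch();
        spark.waitFor();
    }
}
```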

HTH,
Luc

Luc Bourlier
*Spark Team  - Typesafe, Inc.*
luc.bourl...@typesafe.com

<http://www.typesafe.com>

On Wed, Oct 21, 2015 at 4:10 AM, qinggangwa...@gmail.com <
qinggangwa...@gmail.com> wrote:

> Hi all,
>  I want to launch a Spark job on YARN from Java, but it seems that there is
> no way to set numExecutors in the class SparkLauncher. Is there any way to
> set numExecutors?
> Thanks
>
> ------------------------------
> qinggangwa...@gmail.com
>
