Github user ConeyLiu commented on the issue:
https://github.com/apache/spark/pull/17859
It only affects YARN mode; see the following code:
```
OptionAssigner(args.numExecutors, YARN, ALL_DEPLOY_MODES,
  sysProp = "spark.executor.instances"),
```
But in the latest code, this parameter affects both `YARN client` and `YARN
cluster` mode; this changed with
[SPARK-9092](https://github.com/apache/spark/pull/7657), but the comment in the
config template was not updated accordingly.
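For illustration, here is a minimal sketch (simplified names, not the real `SparkSubmit` internals) of how an `OptionAssigner`-style rule works: the value is copied into the given system property only when both the cluster manager and the deploy mode bit-match the current submission, so with `ALL_DEPLOY_MODES` it fires in both client and cluster mode.
```
import scala.collection.mutable

// Simplified sketch: bit flags and a rule that applies for YARN in all deploy modes.
object OptionAssignerSketch {
  val YARN = 1                        // bit flag for the YARN cluster manager
  val CLIENT = 1
  val CLUSTER = 2
  val ALL_DEPLOY_MODES = CLIENT | CLUSTER

  case class OptionAssigner(value: String, clusterManager: Int, deployMode: Int, sysProp: String)

  // Copy the value into props only when cluster manager and deploy mode both match.
  def applyRule(opt: OptionAssigner, clusterManager: Int, deployMode: Int,
                props: mutable.Map[String, String]): Unit = {
    if (opt.value != null &&
        (opt.clusterManager & clusterManager) != 0 &&
        (opt.deployMode & deployMode) != 0) {
      props(opt.sysProp) = opt.value
    }
  }

  def main(args: Array[String]): Unit = {
    val rule = OptionAssigner("4", YARN, ALL_DEPLOY_MODES, "spark.executor.instances")
    val props = mutable.Map.empty[String, String]
    applyRule(rule, YARN, CLIENT, props)   // fires in YARN client mode
    applyRule(rule, YARN, CLUSTER, props)  // and in YARN cluster mode too
    println(props)                         // Map(spark.executor.instances -> 4)
  }
}
```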
Also, this parameter is set in `spark-env.sh`, so according to the
`SparkSubmitArguments` class it can only be fetched via `env.get()`. However,
the parameter is now resolved by the following code:
```
numExecutors = Option(numExecutors)
.getOrElse(sparkProperties.get("spark.executor.instances").orNull)
```
So it only takes effect if you set it via `--num-executors` or define it as a
configuration (`spark-defaults.conf` or `--conf`).
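As a small sketch (simplified names, not the actual `SparkSubmitArguments` code), this is why a value exported only in `spark-env.sh` never reaches that lookup:
```
// Hypothetical, simplified resolution mirroring the snippet above: the env map
// (which is where spark-env.sh exports would show up) is never consulted.
object NumExecutorsSketch {
  def main(args: Array[String]): Unit = {
    val sparkProperties = Map("spark.executor.instances" -> "4") // from --conf / spark-defaults.conf
    val env = Map("SPARK_EXECUTOR_INSTANCES" -> "8")             // stands in for what spark-env.sh exports

    var numExecutors: String = null                              // would be non-null if --num-executors was given
    numExecutors = Option(numExecutors)
      .getOrElse(sparkProperties.get("spark.executor.instances").orNull)

    println(numExecutors) // "4" -- the env value is never read, so spark-env.sh has no effect
  }
}
```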