Github user andrewor14 commented on a diff in the pull request:
https://github.com/apache/spark/pull/16975#discussion_r101887589
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
@@ -466,7 +466,7 @@ object SparkSubmit extends CommandLineUtils {
// Other options
     OptionAssigner(args.executorCores, STANDALONE | YARN, ALL_DEPLOY_MODES,
       sysProp = "spark.executor.cores"),
-    OptionAssigner(args.executorMemory, STANDALONE | MESOS | YARN, ALL_DEPLOY_MODES,
+    OptionAssigner(args.executorMemory, ALL_CLUSTER_MGRS, ALL_DEPLOY_MODES,
--- End diff --
The inconsistency is already inherent in the parameters of
`local-cluster[]`, so I'm not introducing it with this change. I
personally think it's a really bad interface to force the user to set
executor memory in two different places and require that the two values
match.
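
For context, a minimal sketch of the duplication being described (values are
hypothetical): with a `local-cluster[...]` master the per-worker memory is
encoded in the master string, and `spark.executor.memory` has to be set to a
matching value in a separate place.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch only, with hypothetical values: the per-worker memory (1024 MB) is
// encoded in the local-cluster master string, and spark.executor.memory must
// be set to a matching value separately.
val conf = new SparkConf()
  .setAppName("local-cluster-memory-example")
  .setMaster("local-cluster[2, 1, 1024]")   // 2 workers, 1 core each, 1024 MB each
  .set("spark.executor.memory", "1024m")    // expected to match the 1024 above
val sc = new SparkContext(conf)
```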