Github user tgravescs commented on a diff in the pull request:
https://github.com/apache/spark/pull/15009#discussion_r107508739
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
@@ -735,7 +749,12 @@ object SparkSubmit extends CommandLineUtils {
         }
         try {
-          mainMethod.invoke(null, childArgs.toArray)
+          if (isSparkApp) {
+            val envvars = Map[String, String]() ++ sys.env
+            mainMethod.invoke(null, childArgs.toArray, childSparkConf,
+              envvars.toMap)
--- End diff ---
I'm trying to remember all the discussions on this. Originally we were also
looking at 1.6, where various configs were passed via env variables. That's no
longer true in 2.x, but there are still env variables that matter:
SPARK_CLASSPATH, SPARK_HOME, SPARK_CONF_DIR, SPARK_LOCAL_IP, HADOOP_CONF_DIR,
PYSPARK_DRIVER_PYTHON, etc. If a user wanted to change those between
submitting applications, the current method makes sense, although from the
user's point of view it would probably be better to specify them explicitly
per submission rather than set them in the env, submit one app, change the
env, and submit another.

@vanzin thoughts on that? Are you expecting users not to change those?
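
For illustration, here is a minimal sketch of the "specify them explicitly"
alternative described above. It is not the PR's actual interface; the trait
name `InProcessApp`, the `Submitter` object, and the `extraEnv` parameter are
all hypothetical, and it only assumes that `sys.env` can be snapshotted and
merged with per-submission overrides:

```scala
// Hypothetical entry point: the app receives its env as an argument instead
// of reading the global process environment.
trait InProcessApp {
  def start(args: Array[String], conf: Map[String, String],
      env: Map[String, String]): Unit
}

object Submitter {
  // Merge a snapshot of the process env with caller-supplied overrides, so
  // two back-to-back submissions can see different values of e.g.
  // SPARK_CONF_DIR without mutating the global environment in between.
  def submit(
      app: InProcessApp,
      args: Array[String],
      conf: Map[String, String],
      extraEnv: Map[String, String] = Map.empty): Unit = {
    val env = sys.env ++ extraEnv
    app.start(args, conf, env)
  }
}
```

With that shape, pointing a second app at a different conf dir would be a
one-liner, e.g.
`Submitter.submit(app2, args2, conf2, Map("SPARK_CONF_DIR" -> "/other/conf"))`,
instead of set-env / submit / change-env / submit.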