Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/19631#discussion_r149451263
--- Diff: core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionClient.scala ---
@@ -398,9 +399,20 @@ private[spark] object RestSubmissionClient {
   val PROTOCOL_VERSION = "v1"

   /**
-   * Submit an application, assuming Spark parameters are specified through the given config.
-   * This is abstracted to its own method for testing purposes.
+   * Filter non-spark environment variables from any environment.
    */
+  private[rest] def filterSystemEnvironment(env: Map[String, String]): Map[String, String] = {
+    env.filterKeys { k =>
+      // SPARK_HOME is filtered out because it is usually wrong on the remote machine (SPARK-12345)
+      (k.startsWith("SPARK_") && k != "SPARK_ENV_LOADED" && k != "SPARK_HOME") ||
+        k.startsWith("MESOS_")
--- End diff --
I don't know, but this should be the exact same behavior as before, no? I didn't really change this code.
---
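For reference, the filtering logic in the diff can be sketched as a standalone function. This is a hypothetical self-contained reconstruction, not the actual Spark method: it keeps only `SPARK_*` variables (excluding `SPARK_ENV_LOADED` and `SPARK_HOME`) and `MESOS_*` variables, dropping everything else. It uses `filter` rather than the diff's `filterKeys`, since `filterKeys` is deprecated in Scala 2.13, but the behavior is the same.

```scala
// Hypothetical sketch of the environment-filtering logic shown in the diff.
// Keeps SPARK_* variables (except SPARK_ENV_LOADED and SPARK_HOME, which is
// usually wrong on the remote machine) and MESOS_* variables.
def filterSystemEnvironment(env: Map[String, String]): Map[String, String] = {
  env.filter { case (k, _) =>
    (k.startsWith("SPARK_") && k != "SPARK_ENV_LOADED" && k != "SPARK_HOME") ||
      k.startsWith("MESOS_")
  }
}
```

For example, given an environment containing `SPARK_CONF_DIR`, `SPARK_HOME`, `SPARK_ENV_LOADED`, `MESOS_NATIVE_JAVA_LIBRARY`, and `PATH`, only `SPARK_CONF_DIR` and `MESOS_NATIVE_JAVA_LIBRARY` survive the filter.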
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]