GitHub user ArtRand commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19631#discussion_r149375998
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionClient.scala ---
    @@ -398,9 +399,20 @@ private[spark] object RestSubmissionClient {
       val PROTOCOL_VERSION = "v1"
    
       /**
    -   * Submit an application, assuming Spark parameters are specified through the given config.
    -   * This is abstracted to its own method for testing purposes.
    +   * Filter non-spark environment variables from any environment.
        */
    +  private[rest] def filterSystemEnvironment(env: Map[String, String]): Map[String, String] = {
    +    env.filterKeys { k =>
    +      // SPARK_HOME is filtered out because it is usually wrong on the remote machine (SPARK-12345)
    +      (k.startsWith("SPARK_") && k != "SPARK_ENV_LOADED" && k != "SPARK_HOME") ||
    +        k.startsWith("MESOS_")
    --- End diff ---
    
    Might this break Mesos when using [the mesos bundle](https://github.com/apache/spark/blob/master/conf/spark-env.sh.template#L33)?
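    
    For context, here is a minimal standalone sketch (not part of the PR, and the sample environment is hypothetical) that applies the same predicate as the proposed `filterSystemEnvironment` to an environment like one populated from the linked spark-env.sh.template, which, if I read it correctly, is where MESOS_NATIVE_JAVA_LIBRARY would typically be set:
    
    ```scala
    // Mirrors the predicate from the proposed filterSystemEnvironment; the sample
    // environment below is hypothetical.
    object FilterSketch {
      private def keep(k: String): Boolean =
        (k.startsWith("SPARK_") && k != "SPARK_ENV_LOADED" && k != "SPARK_HOME") ||
          k.startsWith("MESOS_")
    
      def main(args: Array[String]): Unit = {
        val env = Map(
          "SPARK_HOME" -> "/opt/spark",                          // dropped: usually wrong on the remote machine
          "SPARK_ENV_LOADED" -> "1",                             // dropped explicitly
          "SPARK_SCALA_VERSION" -> "2.11",                       // kept (SPARK_ prefix)
          "MESOS_NATIVE_JAVA_LIBRARY" -> "/usr/lib/libmesos.so", // kept (MESOS_ prefix)
          "PATH" -> "/usr/bin"                                   // dropped: neither prefix
        )
        env.filter { case (k, _) => keep(k) }.keys.toSeq.sorted.foreach(println)
        // Prints: MESOS_NATIVE_JAVA_LIBRARY, SPARK_SCALA_VERSION
      }
    }
    ```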


---
