GitHub user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/15009#discussion_r87465101
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
    @@ -735,7 +757,11 @@ object SparkSubmit {
         }
     
         try {
    -      mainMethod.invoke(null, childArgs.toArray)
    +      if (isSparkApp) {
    +        mainMethod.invoke(null, childArgs.toArray, childSparkConf, childSysProps)
    --- End diff ---
    
    What is the difference between `childSparkConf` and `childSysProps`?
    
    SparkSubmit currently uses system properties because it has no other way
    to communicate Spark configs to the apps. But `SparkApp` provides that,
    so basically, there's no need to tell the child app about system
    properties, since SparkSubmit shouldn't be setting any when calling a
    `SparkApp`.
    
    As noted in my previous comments, instead of `childSysProps` here you
    probably want some way to specify a custom environment (a.k.a. the `env`
    argument in `SparkLauncher(Map<String, String> env)`). That should not be
    mixed with system properties or Spark configuration.
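    
    To make that concrete, here is a minimal sketch of the shape I have in
    mind, assuming a hypothetical `SparkApp` trait with a `start` method
    (both names are illustrative, not necessarily what this PR should use):
    Spark configs and the custom environment arrive as two explicit, separate
    maps, and system properties drop out entirely.
    
    ```scala
    // Minimal sketch only: the trait and method names are illustrative.
    // The point is that Spark configuration and the custom environment
    // travel as explicit, separate arguments, not JVM system properties.
    trait SparkApp {
      def start(
          args: Array[String],       // application arguments
          conf: Map[String, String], // Spark configuration (spark.* keys)
          env: Map[String, String]   // custom environment, like SparkLauncher's `env`
      ): Unit
    }
    
    object ExampleApp extends SparkApp {
      override def start(
          args: Array[String],
          conf: Map[String, String],
          env: Map[String, String]): Unit = {
        // The app reads Spark configs from `conf` directly...
        val master = conf.getOrElse("spark.master", "local[*]")
        // ...and any environment overrides from `env`, never from
        // System.getProperties.
        println(s"master=$master, env=${env.keySet}")
      }
    }
    ```
    
    On the SparkSubmit side the call would then be roughly
    `app.start(childArgs.toArray, childConf, childEnv)` (with `childEnv`
    being the hypothetical replacement for `childSysProps`), instead of a
    reflective `main` invocation plus `System.setProperty` calls.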

