GitHub user ehnalis commented on a diff in the pull request:

    https://github.com/apache/spark/pull/6083#discussion_r30171835
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
    @@ -371,6 +371,13 @@ class SparkContext(config: SparkConf) extends Logging with ExecutorAllocationClient
          throw new SparkException("An application name must be set in your configuration")
        }
    
    +    // Thread name has been set to "Driver" if user code is run by the AM on a YARN cluster
    --- End diff ---
    
    Checking in `SparkConf` would add overhead, I think, because `set` would
    then have to perform the check on every call. The best place to deal with
    it is when a `SparkContext` is initialized with an invalid `SparkConf`
    (i.e. a `yarn-cluster` setup). As you said, checking for the system
    property `spark.yarn.app.id` would be the most feasible approach.
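    
    For what it's worth, here is a minimal sketch of what that check could
    look like at `SparkContext` initialization time. The helper object and
    method names are made up for illustration, and the actual patch may
    place and word this differently:
    
    ```scala
    import org.apache.spark.{SparkConf, SparkException}
    
    // Hypothetical helper illustrating the validation discussed above.
    object YarnClusterValidation {
    
      /** Fail fast if yarn-cluster mode is requested outside a YARN AM. */
      def validate(conf: SparkConf): Unit = {
        // When the ApplicationMaster launches the driver, it sets the
        // spark.yarn.app.id system property; its absence means the user is
        // creating a SparkContext directly with a yarn-cluster master.
        if (conf.get("spark.master", "") == "yarn-cluster" &&
            sys.props.get("spark.yarn.app.id").isEmpty) {
          throw new SparkException(
            "Detected yarn-cluster mode, but the application is not running " +
              "under a YARN ApplicationMaster. Please use spark-submit.")
        }
      }
    }
    ```
    
    Doing the check once here, rather than in `SparkConf.set`, keeps the
    validation out of the hot path while still failing fast on an invalid
    setup.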

