Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1218#discussion_r15482474
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
    @@ -1253,7 +1253,8 @@ class SparkContext(config: SparkConf) extends Logging {
     
       /** Post the application start event */
       private def postApplicationStart() {
    -    listenerBus.post(SparkListenerApplicationStart(appName, startTime, sparkUser))
    +    listenerBus.post(SparkListenerApplicationStart(appName, taskScheduler.applicationId(),
    --- End diff --
    
    Have you verified the initialization order? This imposes a new ordering constraint: the `taskScheduler` must be initialized before we post the application start event, otherwise this throws an NPE. It would be good to at least add a comment documenting this.
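    
    For illustration, here is a minimal sketch (mine, not part of the PR) of the kind of defensive check and comment that could make the constraint explicit. It assumes `taskScheduler` is still a nullable field on `SparkContext` at this point, and that the remaining constructor arguments stay `startTime, sparkUser` as before:
    
    ```scala
      /** Post the application start event */
      private def postApplicationStart() {
        // This must run after the task scheduler has been initialized, since we
        // read the application ID from it. An explicit check turns a confusing
        // NPE into a descriptive error if the initialization order ever changes.
        require(taskScheduler != null,
          "postApplicationStart() called before the task scheduler was initialized")
        listenerBus.post(SparkListenerApplicationStart(appName, taskScheduler.applicationId(),
          startTime, sparkUser))
      }
    ```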

