Shixiong Zhu created SPARK-5072:
-----------------------------------

             Summary: Race condition in TaskSchedulerImpl.dagScheduler
                 Key: SPARK-5072
                 URL: https://issues.apache.org/jira/browse/SPARK-5072
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
            Reporter: Shixiong Zhu
            Priority: Minor


`TaskSchedulerImpl.dagScheduler` is set in 
`DAGSchedulerEventProcessActor.preStart`. However, Akka does not guarantee 
which thread `Actor.preStart` runs in; usually it runs in a thread different 
from the one that created the actor. Without proper synchronization, 
`TaskSchedulerImpl.dagScheduler` may still be null when it is used. The 
following test failure demonstrates the problem.

{noformat}
[info] - Scheduler does not always schedule tasks on the same workers *** FAILED *** (37 milliseconds)
[info]   java.lang.NullPointerException:
[info]   at org.apache.spark.scheduler.TaskSchedulerImpl.executorAdded(TaskSchedulerImpl.scala:459)
[info]   at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$resourceOffers$1.apply(TaskSchedulerImpl.scala:226)
[info]   at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$resourceOffers$1.apply(TaskSchedulerImpl.scala:221)
[info]   at scala.collection.immutable.List.foreach(List.scala:318)
[info]   at org.apache.spark.scheduler.TaskSchedulerImpl.resourceOffers(TaskSchedulerImpl.scala:221)
[info]   at org.apache.spark.scheduler.TaskSchedulerImplSuite$$anonfun$4$$anonfun$6.apply(TaskSchedulerImplSuite.scala:287)
[info]   at org.apache.spark.scheduler.TaskSchedulerImplSuite$$anonfun$4$$anonfun$6.apply(TaskSchedulerImplSuite.scala:284)
[info]   at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
[info]   at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
[info]   at scala.collection.immutable.Range.foreach(Range.scala:141)
[info]   at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
{noformat}
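
The race can be reproduced without Akka. A minimal sketch (not Spark's actual 
code; `Scheduler`, `executorAdded`, and the 50 ms delay are all illustrative 
stand-ins): a field assigned from a second thread, simulating 
`DAGSchedulerEventProcessActor.preStart` running on a dispatcher thread, may 
still be null when the first thread dereferences it.

{noformat}
// Illustrative sketch of the race, not Spark's real implementation.
object DagSchedulerRaceSketch {
  class Scheduler {
    // Stands in for TaskSchedulerImpl.dagScheduler: assigned later,
    // possibly from a different thread.
    @volatile var dagScheduler: AnyRef = _

    // Dereferences the field; throws NullPointerException if the
    // initialization thread has not run yet.
    def executorAdded(): String = dagScheduler.toString
  }

  def main(args: Array[String]): Unit = {
    val sched = new Scheduler
    // Simulates preStart running on an actor dispatcher thread: Akka gives
    // no guarantee about when (or on which thread) it executes.
    val initThread = new Thread(() => {
      Thread.sleep(50) // illustrative delay
      sched.dagScheduler = new Object
    })
    initThread.start()
    // The caller races with initialization and can observe null here.
    try sched.executorAdded()
    catch { case _: NullPointerException => println("NPE: dagScheduler was null") }
    initThread.join()
  }
}
{noformat}

A straightforward fix along these lines is to assign the field before any 
caller can observe it (e.g. before starting the actor), or to guard the read 
with explicit synchronization, rather than relying on `preStart` ordering.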




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
