cloud-fan commented on a change in pull request #23927: [SPARK-23433][CORE] avoid more than one active task set managers for a stage
URL: https://github.com/apache/spark/pull/23927#discussion_r261917390
 
 

 ##########
 File path: core/src/test/scala/org/apache/spark/scheduler/TaskSchedulerImplSuite.scala
 ##########
 @@ -201,30 +201,10 @@ class TaskSchedulerImplSuite extends SparkFunSuite with LocalSparkContext with B
     // Even if one of the task sets has not-serializable tasks, the other task set should
     // still be processed without error
     taskScheduler.submitTasks(FakeTask.createTaskSet(1))
-    taskScheduler.submitTasks(taskSet)
     taskDescriptions = taskScheduler.resourceOffers(multiCoreWorkerOffers).flatten
     assert(taskDescriptions.map(_.executorId) === Seq("executor0"))
   }
 
-  test("refuse to schedule concurrent attempts for the same stage 
(SPARK-8103)") {
 
 Review comment:
   this part of the code is reverted in this PR, so the test should be removed as well
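
   For context, here is a rough sketch of the behaviour that test covered (its body is elided in this hunk), assuming the suite's `setupScheduler()` helper, a `FakeTask.createTaskSet(numTasks, stageId, stageAttemptId)` overload, and the package-private `TaskSchedulerImpl.taskSetManagerForAttempt`. With the SPARK-8103 check reverted by this PR, an assertion like this no longer holds:

   ```scala
   // Illustrative sketch only, not the literal removed test body; it relies on
   // the assumed suite helpers named above.
   test("refuse to schedule concurrent attempts for the same stage (SPARK-8103)") {
     val taskScheduler = setupScheduler()
     // Two attempts of the same stage: numTasks = 1, stageId = 0, attempts 0 and 1.
     val attempt1 = FakeTask.createTaskSet(1, 0, 0)
     val attempt2 = FakeTask.createTaskSet(1, 0, 1)

     taskScheduler.submitTasks(attempt1)
     // While attempt 0 is still active, submitting another attempt of the same
     // stage was rejected outright by the SPARK-8103 check.
     intercept[IllegalStateException] {
       taskScheduler.submitTasks(attempt2)
     }

     // Once the first attempt's manager is a zombie, a new attempt is accepted.
     taskScheduler.taskSetManagerForAttempt(attempt1.stageId, attempt1.stageAttemptId)
       .foreach(_.isZombie = true)
     taskScheduler.submitTasks(attempt2)
   }
   ```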

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services

