Github user dhruve commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19194#discussion_r141482741
  
    --- Diff: core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala ---
    @@ -512,6 +535,9 @@ private[spark] class TaskSetManager(
               serializedTask)
           }
         } else {
    +      if (runningTasks >= maxConcurrentTasks) {
    +        logDebug("Already running max. no. of concurrent tasks.")
    --- End diff ---
    
    I'll make this change and also update the comments to explain the behavior.
    Also, from the earlier reply I am not clear what the resolution was for
    accounting the activeJobId. Do you still have any inputs?
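
    For context, a minimal sketch of the cap the diff adds: once the number of
    running tasks reaches maxConcurrentTasks, no further tasks are offered. The
    surrounding class and scheduling loop below are illustrative assumptions,
    not the actual TaskSetManager code:

        // Hypothetical sketch, not Spark's TaskSetManager.
        class CappedScheduler(maxConcurrentTasks: Int) {
          private var runningTasks = 0
          private val pending = scala.collection.mutable.Queue[String]()

          def submit(task: String): Unit = pending.enqueue(task)

          // Returns the next task to launch, or None once the cap is reached.
          def offerTask(): Option[String] = {
            if (runningTasks >= maxConcurrentTasks) {
              // Mirrors the guard in the diff: decline to launch more work.
              println("Already running max. no. of concurrent tasks.")
              None
            } else if (pending.nonEmpty) {
              runningTasks += 1
              Some(pending.dequeue())
            } else {
              None
            }
          }

          // Call when a task completes so a queued task can be launched later.
          def taskFinished(): Unit = {
            runningTasks = math.max(0, runningTasks - 1)
          }
        }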


---
