Github user jerryshao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19580#discussion_r147325260
  
    --- Diff: 
core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala ---
    @@ -267,6 +267,10 @@ private[spark] class ExecutorAllocationManager(
         (numRunningOrPendingTasks + tasksPerExecutor - 1) / tasksPerExecutor
       }
     
    +  private def totalRunningTasks(): Int = synchronized {
    --- End diff ---
    
    I'm not sure why we need to add a method that is only used for unit tests.
If you want to verify the behavior of `totalRunningTasks`, I think
`maxNumExecutorsNeeded` could also be used for indirect verification.
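For context, the line above the added method in the diff computes executors needed via ceiling division. A minimal standalone sketch of that formula (not the actual Spark code; names and values here are illustrative):

```scala
// Sketch of the ceiling-division formula from the diff:
// executors needed = ceil(numRunningOrPendingTasks / tasksPerExecutor),
// computed with integer arithmetic. Assumes tasksPerExecutor > 0.
object MaxExecutorsSketch {
  def maxExecutorsNeeded(numRunningOrPendingTasks: Int, tasksPerExecutor: Int): Int =
    (numRunningOrPendingTasks + tasksPerExecutor - 1) / tasksPerExecutor

  def main(args: Array[String]): Unit = {
    // 10 tasks at 4 per executor -> ceil(2.5) = 3 executors
    assert(maxExecutorsNeeded(10, 4) == 3)
    // Exact multiple: no rounding up needed
    assert(maxExecutorsNeeded(8, 4) == 2)
    // No tasks -> no executors
    assert(maxExecutorsNeeded(0, 4) == 0)
    println("ok")
  }
}
```

This is why a test asserting on `maxNumExecutorsNeeded` indirectly exercises the running-task count without exposing a test-only accessor.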


---

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
