Github user mengxr commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22001#discussion_r208065660
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
    @@ -1597,6 +1597,15 @@ class SparkContext(config: SparkConf) extends Logging {
         }
       }
     
    +  /**
    +   * Get the number of currently active slots (the total number of tasks that can be
    +   * launched at the moment). Note: do not cache the value returned by this method,
    +   * because the number can change as executors are added or removed.
    +   *
    +   * @return The number of tasks that can currently be launched.
    +   */
    +  private[spark] def getNumSlots(): Int = schedulerBackend.getNumSlots()
    --- End diff ---
    
    How about `maxConcurrentTasks`?

