Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21589#discussion_r202503533
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
    @@ -2336,6 +2336,18 @@ class SparkContext(config: SparkConf) extends Logging {
        */
       def defaultMinPartitions: Int = math.min(defaultParallelism, 2)
     
    +  /**
    +   * Total number of CPU cores of all executors registered in the cluster at the moment.
    +   * The number reflects the current status of the cluster and can change in the future.
    +   */
    --- End diff --
    
    that means the new API should be marked with the `@Experimental` annotation:
https://github.com/apache/spark/blob/39e2bad6a866d27c3ca594d15e574a1da3ee84cc/common/tags/src/main/java/org/apache/spark/annotation/Experimental.java
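
    A minimal sketch of what applying that annotation to the new method could look like. Note the assumptions: the real `Experimental` annotation lives in Spark's `spark-tags` module (linked above), so a local stand-in is defined here to keep the sketch self-contained, and the method name `totalExecutorCores` and its body are hypothetical, since the actual name is not shown in this excerpt.

    ```scala
    import scala.annotation.StaticAnnotation

    // Local stand-in for org.apache.spark.annotation.Experimental, defined
    // here only so this sketch compiles on its own; in Spark itself you would
    // import the real annotation from the spark-tags module.
    class Experimental extends StaticAnnotation

    object SparkContextSketch {
      /**
       * Total number of CPU cores of all executors registered in the cluster at the moment.
       * The number reflects the current status of the cluster and can change in the future.
       */
      @Experimental
      def totalExecutorCores: Int = 8 // placeholder value; the real method would query the scheduler backend
    }
    ```

    The annotation is purely an API-stability marker for users and tooling; it does not change the method's runtime behavior.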


---
