Github user MaxGekk commented on a diff in the pull request:
https://github.com/apache/spark/pull/21589#discussion_r202459283
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -2336,6 +2336,18 @@ class SparkContext(config: SparkConf) extends
Logging {
*/
def defaultMinPartitions: Int = math.min(defaultParallelism, 2)
+  /**
+   * Total number of CPU cores of all executors registered in the cluster at the moment.
+   * The number reflects current status of the cluster and can change in the future.
+   */
--- End diff ---
> Let's at least leave a @note that this feature is experimental.
What does `experimental` mean for the user? Unstable? That it can change in the future? When I, as a user, read the note, how should I change my app to take that into account?
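
To make the concern concrete: a minimal, self-contained sketch (plain Scala, no Spark dependency; the method name `totalCores` is hypothetical, standing in for the new API) of how an app might defend against a value that reflects live cluster state. Since the number can change between calls, the app should snapshot it once per decision rather than assume repeated calls agree:

```scala
// Sketch only: `totalCores` is a hypothetical stand-in for the proposed
// SparkContext method, whose result reflects current cluster state.
object CoreCountExample {
  // Mutable state simulating a cluster whose executor count changes over time.
  private var cores: Int = 8

  // Stand-in for the proposed API: returns the core count "at the moment".
  def totalCores(): Int = cores

  // Decide a partition count from ONE snapshot of the value, so the decision
  // stays consistent even if the cluster scales while we are computing.
  def planPartitions(): Int = {
    val snapshot = totalCores() // take a single snapshot: 8
    cores = 16                  // cluster scales up concurrently
    snapshot * 2                // plan from the snapshot, not the new value
  }
}
```

The point of the sketch: if "experimental" means the value is merely a momentary observation, the doc should say so, and users should treat each call as a snapshot rather than a stable property of the cluster.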
---