Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/21589#discussion_r201887513
--- Diff:
core/src/main/scala/org/apache/spark/api/java/JavaSparkContext.scala ---
@@ -128,6 +128,18 @@ class JavaSparkContext(val sc: SparkContext)
/** Default min number of partitions for Hadoop RDDs when not given by user */
def defaultMinPartitions: java.lang.Integer = sc.defaultMinPartitions
+ /**
+  * Total number of CPU cores of all executors registered in the cluster at the moment.
+  * The number reflects the current status of the cluster and can change in the future.
+  */
+ def numCores: java.lang.Integer = sc.numCores
--- End diff ---
ditto for `@since 2.4.0`
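
For reference, a minimal sketch of how the accessor could look with the tag added (the Scaladoc text is taken from the diff above; the tag placement simply follows the usual Spark convention and is only illustrative):

```scala
  /**
   * Total number of CPU cores of all executors registered in the cluster at the moment.
   * The number reflects the current status of the cluster and can change in the future.
   *
   * @since 2.4.0
   */
  def numCores: java.lang.Integer = sc.numCores
```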
---