Github user MaxGekk commented on a diff in the pull request:
https://github.com/apache/spark/pull/21589#discussion_r198590931
--- Diff: R/pkg/R/context.R ---
@@ -25,6 +25,22 @@ getMinPartitions <- function(sc, minPartitions) {
as.integer(minPartitions)
}
+#' Total number of CPU cores of all executors registered in the cluster at the moment.
+#'
+#' @param sc SparkContext to use
+#' @return current number of cores in the cluster.
+numCores <- function(sc) {
+ callJMethod(sc, "numCores")
+}
+
+#' Total number of executors registered in the cluster at the moment.
+#'
+#' @param sc SparkContext to use
+#' @return current number of executors in the cluster.
+numExecutors <- function(sc) {
+ callJMethod(sc, "numExecutors")
+}
+
--- End diff ---
Thank you for pointing out the example of `spark.addFile`. I changed
`spark.numCores` and `spark.numExecutors` in the same way.
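
For reference, a minimal usage sketch of the renamed API, assuming a running SparkR session (the `spark.numCores` and `spark.numExecutors` names are as proposed in this PR; the final merged API may differ):

```r
library(SparkR)

# Start a SparkR session; local mode is shown only for illustration,
# so the reported counts reflect the local backend rather than a real cluster.
sparkR.session(master = "local[*]")

# Proposed calls from this PR (hypothetical until merged):
cores <- spark.numCores()       # total CPU cores of all registered executors
execs <- spark.numExecutors()   # number of executors currently registered

sparkR.session.stop()
```

Note that both values are point-in-time snapshots: with dynamic allocation enabled, the number of registered executors (and hence cores) can change between calls.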
---