Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/21589#discussion_r198219194
--- Diff: R/pkg/R/context.R ---
@@ -25,6 +25,22 @@ getMinPartitions <- function(sc, minPartitions) {
as.integer(minPartitions)
}
+#' Total number of CPU cores of all executors registered in the cluster at the moment.
+#'
+#' @param sc SparkContext to use
+#' @return current number of cores in the cluster.
+numCores <- function(sc) {
+ callJMethod(sc, "numCores")
+}
+
+#' Total number of executors registered in the cluster at the moment.
+#'
+#' @param sc SparkContext to use
+#' @return current number of executors in the cluster.
+numExecutors <- function(sc) {
+ callJMethod(sc, "numExecutors")
+}
+
--- End diff --
actually, all SparkContext methods (i.e., those taking an `sc` parameter) are
internal/non-public/deprecated.
see `spark.addFile`
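
The convention referenced above can be sketched as follows — a hedged illustration only, assuming SparkR's internal `getSparkContext()` helper (the pattern `spark.addFile` uses); the `spark.numCores` name is hypothetical, not part of the PR:

```r
# Hypothetical sketch of the spark.addFile-style public API:
# the active SparkContext is resolved internally rather than
# passed in by the caller, so no `sc` parameter is exposed.
spark.numCores <- function() {
  sc <- getSparkContext()  # SparkR-internal helper, as in spark.addFile
  callJMethod(sc, "numCores")
}
```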
---