Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/21589#discussion_r201913817
--- Diff: R/pkg/R/context.R ---
@@ -435,3 +435,31 @@ setCheckpointDir <- function(directory) {
  sc <- getSparkContext()
  invisible(callJMethod(sc, "setCheckpointDir",
                        suppressWarnings(normalizePath(directory))))
}
+
+#' Total number of CPU cores of all executors registered in the cluster at the moment.
--- End diff --
btw, by `in this cluster` do we really mean cores allocated to the
"application" or "job"? It's not really the whole cluster, right? If I'm
running this app on Hadoop/YARN with 1000s of cores but only set aside 100
for this app, which number am I getting from this API?
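
To make the distinction concrete, here's a minimal SparkR sketch (not the PR's implementation; the session setup and the arithmetic are illustrative and assume static allocation on YARN):

```r
library(SparkR)

# Illustrative only: assumes a YARN deployment with static allocation.
sparkR.session(master = "yarn")

# Cores granted to *this application* are bounded by
# executor instances x cores per executor, regardless of how many
# cores the cluster as a whole has.
numExecutors <- as.integer(sparkR.conf("spark.executor.instances", "2"))
coresPerExecutor <- as.integer(sparkR.conf("spark.executor.cores", "1"))
appCores <- numExecutors * coresPerExecutor  # e.g. 100 for this app,
                                             # even on a cluster with 1000s of cores
```

Whichever number the new API actually returns, the roxygen doc should name it explicitly: cores of executors registered with this application vs. cores in the cluster.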