Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/21589#discussion_r201968413
--- Diff: R/pkg/R/context.R ---
@@ -435,3 +435,31 @@ setCheckpointDir <- function(directory) {
sc <- getSparkContext()
invisible(callJMethod(sc, "setCheckpointDir",
suppressWarnings(normalizePath(directory))))
}
+
+#' Total number of CPU cores of all executors registered in the cluster at the moment.
--- End diff --
I tested this on a YARN cluster now. At a higher level, from the user's perspective (regardless of the details above), I think it is more accurate to describe this as the number of cores assigned to the application. Let's also clarify that this API is experimental.