Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/21885#discussion_r206652789
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -2668,17 +2668,24 @@ object SparkContext extends Logging {
}
/**
- * The number of driver cores to use for execution in local mode, 0 otherwise.
+ * The number of cores available to the driver to use for tasks such as IO with Netty
*/
- private[spark] def numDriverCores(master: String): Int = {
+ private[spark] def numDriverCores(master: String, conf: SparkConf = null): Int = {
--- End diff ---
You'll probably have to add a MiMa exclude for this change. It would be
super-safe to just overload the method instead, if there were a concern that
Java callers were calling this method directly. I don't think that's likely
here, though.
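
For reference, a rough standalone sketch of the overload route (placeholder
names and logic only, not the actual SparkContext implementation): keep the
old one-argument method unchanged and have it forward to a new two-argument
method, which leaves the compiled signature intact and avoids the MiMa
exclude entirely.

    object NumDriverCoresSketch {
      // stand-in for org.apache.spark.SparkConf, just to keep this sketch self-contained
      class SparkConf

      // old signature, preserved as-is for binary compatibility; forwards to the new overload
      def numDriverCores(master: String): Int = numDriverCores(master, null)

      // new signature that can consult a SparkConf when one is available
      def numDriverCores(master: String, conf: SparkConf): Int = {
        // placeholder parsing: only the local-mode cases matter for this sketch
        master match {
          case "local" => 1
          case m if m.startsWith("local[") => Runtime.getRuntime.availableProcessors()
          case _ => 0
        }
      }
    }

If the default-parameter version is kept instead, the exclude would normally
be a ProblemFilters entry in project/MimaExcludes.scala targeting
org.apache.spark.SparkContext.numDriverCores (the exact problem type depends
on what MiMa reports).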