beliefer commented on a change in pull request #23560: [SPARK-26632][Spark
Core] Separate Thread Configurations of Driver and Executor
URL: https://github.com/apache/spark/pull/23560#discussion_r248528318
##########
File path:
core/src/main/scala/org/apache/spark/network/netty/SparkTransportConf.scala
##########
@@ -55,4 +58,31 @@ object SparkTransportConf {
}
})
}
+
+ /**
+ * Separate thread configurations of the driver and executor.
+ * @param conf the [[SparkConf]]
+ * @param module the module name
+ * @param server if true, it's for the serverThreads; otherwise, it's for the clientThreads
+ * @param defaultNumThreads the default number of threads
+ * @return the number of threads to use
+ */
+ def getNumOfThreads(
+ conf: SparkConf,
+ module: String,
+ server: Boolean,
+ defaultNumThreads: Int): String = {
+
+ val isDriver = conf.get("spark.executor.id", "") == SparkContext.DRIVER_IDENTIFIER
Review comment:
I think you should also consider the legacy version of DRIVER_IDENTIFIER, e.g.:
```scala
val executorId = conf.get("spark.executor.id", "")
val isDriver =
  executorId == SparkContext.DRIVER_IDENTIFIER ||
  executorId == SparkContext.LEGACY_DRIVER_IDENTIFIER
```
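The suggested check can be sketched as a minimal, self-contained snippet. The literal values `"driver"` and `"<driver>"` below stand in for `SparkContext.DRIVER_IDENTIFIER` and `SparkContext.LEGACY_DRIVER_IDENTIFIER` (these match Spark's definitions, but the `DriverCheck` object itself is a hypothetical stand-in, not part of the PR):

```scala
object DriverCheck {
  // Stand-ins for SparkContext.DRIVER_IDENTIFIER and
  // SparkContext.LEGACY_DRIVER_IDENTIFIER respectively.
  val DriverIdentifier = "driver"
  val LegacyDriverIdentifier = "<driver>"

  // Returns true when the given executor id denotes the driver,
  // accepting both the current and the legacy identifier.
  def isDriver(executorId: String): Boolean =
    executorId == DriverIdentifier || executorId == LegacyDriverIdentifier
}
```

With this helper, `conf.get("spark.executor.id", "")` can be passed straight to `DriverCheck.isDriver` so that pre-2.x-style driver ids are still recognized.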
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services