zjf2012 commented on a change in pull request #23560: [SPARK-26632][Core]
Separate Thread Configurations of Driver and Executor
URL: https://github.com/apache/spark/pull/23560#discussion_r265388998
##########
File path: core/src/main/scala/org/apache/spark/rpc/netty/Dispatcher.scala
##########
@@ -194,12 +194,32 @@ private[netty] class Dispatcher(nettyEnv: NettyRpcEnv, numUsableCores: Int) exte
     endpoints.containsKey(name)
   }

-  /** Thread pool used for dispatching messages. */
-  private val threadpool: ThreadPoolExecutor = {
+  def getNumOfThreads(conf: SparkConf): Int = {
     val availableCores =
       if (numUsableCores > 0) numUsableCores else Runtime.getRuntime.availableProcessors()
-    val numThreads = nettyEnv.conf.get(RPC_NETTY_DISPATCHER_NUM_THREADS)
+    // module configuration
+    val modNumThreads = nettyEnv.conf.get(RPC_NETTY_DISPATCHER_NUM_THREADS)
       .getOrElse(math.max(2, availableCores))
+    // try to get the driver- and executor-specific thread configurations;
+    // they override the module configuration if specified
+    val executorId = conf.get("spark.executor.id", "")
+    // neither driver nor executor if the executor id is not set
Review comment:
Do you mean I should wrap the logic in one method so it can be invoked multiple times?
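
For context, the override logic in the quoted hunk can be sketched standalone as follows. This is a minimal illustration only: the config keys, the `Map`-based config, and the `numDispatcherThreads` helper are assumptions made for the sketch, not Spark's actual `SparkConf` API.

```scala
// Hypothetical sketch: a role-specific thread-count setting (driver or
// executor) takes precedence over the module-wide default, which itself
// falls back to max(2, availableCores). Key names are illustrative.
object ThreadConfSketch {
  def numDispatcherThreads(conf: Map[String, String], availableCores: Int): Int = {
    // module-level configuration, with a computed fallback
    val moduleDefault = conf.get("spark.rpc.netty.dispatcher.numThreads")
      .map(_.toInt)
      .getOrElse(math.max(2, availableCores))
    // "driver" and non-empty ids mark the process role; empty/absent means neither
    val roleKey = conf.get("spark.executor.id") match {
      case Some("driver")          => Some("spark.driver.rpc.netty.dispatcher.numThreads")
      case Some(id) if id.nonEmpty => Some("spark.executor.rpc.netty.dispatcher.numThreads")
      case _                       => None
    }
    // role-specific value wins when present; otherwise the module default
    roleKey.flatMap(conf.get).map(_.toInt).getOrElse(moduleDefault)
  }

  def main(args: Array[String]): Unit = {
    val driverConf = Map(
      "spark.executor.id" -> "driver",
      "spark.driver.rpc.netty.dispatcher.numThreads" -> "4")
    println(numDispatcherThreads(driverConf, 16)) // role-specific value wins: 4
    println(numDispatcherThreads(Map.empty, 16))  // falls back to max(2, 16) = 16
  }
}
```

Wrapping this lookup in one method (as the reviewer's question suggests) keeps the precedence rule in a single place, so both the thread-pool construction and any other caller resolve the same count.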
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]