zjf2012 commented on a change in pull request #23560: [SPARK-26632][Core] Separate Thread Configurations of Driver and Executor
URL: https://github.com/apache/spark/pull/23560#discussion_r264070140
##########
File path: core/src/main/scala/org/apache/spark/rpc/netty/Dispatcher.scala
##########
@@ -194,12 +194,26 @@ private[netty] class Dispatcher(nettyEnv: NettyRpcEnv, numUsableCores: Int) exte
endpoints.containsKey(name)
}
+ def getNumOfThreads(conf: SparkConf): Int = {
+ val executorId = conf.get("spark.executor.id", "")
+ val isDriver = executorId == SparkContext.DRIVER_IDENTIFIER ||
+ executorId == SparkContext.LEGACY_DRIVER_IDENTIFIER
+ val side = if (isDriver) "driver" else "executor"
Review comment:
@vanzin, see my last comment to you. Besides, master and worker are more
loosely coupled than driver and executor, so we have alternatives for giving
them different configurations. For driver and executor, we cannot easily do
that. For example, for spark.rpc.netty.dispatcher.numThreads you may have an
optimized value, A, for the driver, but it is usually not an optimized value
for the executors.
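
For illustration, here is a minimal, self-contained sketch of the lookup I
have in mind. The object name, the exact side-specific key
(spark.driver.rpc.netty.dispatcher.numThreads vs. the executor variant), the
fallback order, and the max(2, numUsableCores) default are my assumptions for
this example, not the exact patch:

    import org.apache.spark.{SparkConf, SparkContext}

    // Illustrative sketch only: resolve a side-specific dispatcher thread
    // count, falling back to the shared key and then to the core count.
    object DispatcherThreadsSketch {
      def getNumOfThreads(conf: SparkConf, numUsableCores: Int): Int = {
        val executorId = conf.get("spark.executor.id", "")
        val isDriver = executorId == SparkContext.DRIVER_IDENTIFIER ||
          executorId == SparkContext.LEGACY_DRIVER_IDENTIFIER
        val side = if (isDriver) "driver" else "executor"
        // The side-specific key wins; otherwise the shared key; otherwise
        // derive a default from the number of usable cores.
        conf.getInt(s"spark.$side.rpc.netty.dispatcher.numThreads",
          conf.getInt("spark.rpc.netty.dispatcher.numThreads",
            math.max(2, numUsableCores)))
      }
    }

With this shape, a value tuned for the driver never leaks into executors:
each side reads its own key first and only shares the old
spark.rpc.netty.dispatcher.numThreads as a common fallback.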