[ 
https://issues.apache.org/jira/browse/SPARK-5375?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

DjvuLee updated SPARK-5375:
---------------------------
    Description: 
In the ConnectionManager.scala file, there are three thread pools: 
handleMessageExecutor, handleReadWriteExecutor, and handleConnectExecutor.

For example:
private val handleMessageExecutor = new ThreadPoolExecutor(
  conf.getInt("spark.core.connection.handler.threads.min", 20),
  conf.getInt("spark.core.connection.handler.threads.max", 60),
  conf.getInt("spark.core.connection.handler.threads.keepalive", 60),
  TimeUnit.SECONDS,
  new LinkedBlockingDeque[Runnable](),
  Utils.namedThreadFactory("handle-message-executor"))

Since we use an unbounded LinkedBlockingDeque, the max thread parameter has no 
effect: ThreadPoolExecutor only creates threads beyond the core pool size when 
the work queue rejects a task, and an unbounded queue never rejects one. Every 
time I read the code this is confusing. Maybe we can add a comment in those 
places?
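
For illustration, here is a minimal standalone sketch (not the Spark code 
itself; the object name and the small core size are made up) of the underlying 
ThreadPoolExecutor behavior: with an unbounded work queue, the pool never grows 
past its core size, so extra tasks simply pile up in the queue.

import java.util.concurrent.{LinkedBlockingDeque, ThreadPoolExecutor, TimeUnit}

object UnboundedQueueDemo {
  def main(args: Array[String]): Unit = {
    // Same shape as handleMessageExecutor, but with a tiny core size so the
    // effect is easy to see. Because the queue is unbounded, the executor
    // never needs to spawn threads beyond the core size of 2.
    val pool = new ThreadPoolExecutor(
      2,                                    // corePoolSize ("min" threads)
      60,                                   // maximumPoolSize, never reached
      60, TimeUnit.SECONDS,                 // keepalive, also never used
      new LinkedBlockingDeque[Runnable]())

    // Submit far more tasks than the core size.
    (1 to 100).foreach { _ =>
      pool.execute(new Runnable { def run(): Unit = Thread.sleep(100) })
    }

    // Prints "pool size = 2"; the remaining tasks just wait in the queue.
    println(s"pool size = ${pool.getPoolSize}, queued = ${pool.getQueue.size}")
    pool.shutdown()
  }
}

So with an unbounded queue only the min/core setting matters; a comment next to 
each pool would make that explicit.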

> Specify more clearly about the max thread meaning in the ConnectionManager
> --------------------------------------------------------------------------
>
>                 Key: SPARK-5375
>                 URL: https://issues.apache.org/jira/browse/SPARK-5375
>             Project: Spark
>          Issue Type: Improvement
>    Affects Versions: 1.1.0
>            Reporter: DjvuLee
>
> In the ConnectionManager.scala file, there are three thread pools: 
> handleMessageExecutor, handleReadWriteExecutor, and handleConnectExecutor.
> For example:
> private val handleMessageExecutor = new ThreadPoolExecutor(
>   conf.getInt("spark.core.connection.handler.threads.min", 20),
>   conf.getInt("spark.core.connection.handler.threads.max", 60),
>   conf.getInt("spark.core.connection.handler.threads.keepalive", 60),
>   TimeUnit.SECONDS,
>   new LinkedBlockingDeque[Runnable](),
>   Utils.namedThreadFactory("handle-message-executor"))
> Since we use an unbounded LinkedBlockingDeque, the max thread parameter has 
> no effect: the pool never grows beyond its core size. Every time I read the 
> code this is confusing. Maybe we can add a comment in those places?


