AngersZhuuuu commented on PR #52919:
URL: https://github.com/apache/spark/pull/52919#issuecomment-3569045477

   > Thank you, @AngersZhuuuu and all.
   > 
   > 1. I also agree with the above review comments. The new configuration and its default value (`120s`) look reasonable to me for Apache Spark 4.2.0.
   > 2. Since `INFINITE_TIMEOUT` seems unused after this PR, shall we remove it, because the comment becomes misleading (`Infinite timeout is used internally, so there's no timeout configuration property that controls it.`)?
   > https://github.com/apache/spark/blob/92c948f4137686e3a566a58f6e671bc0c4a9cce5/core/src/main/scala/org/apache/spark/util/RpcUtils.scala#L58-L64
   > 
   > ```
   > $ git grep INFINITE_TIMEOUT
   > core/src/main/scala/org/apache/spark/util/RpcUtils.scala:  val INFINITE_TIMEOUT = new RpcTimeout(Long.MaxValue.nanos, "infinite")
   > ```
   
   Done, removed `INFINITE_TIMEOUT`.
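   For context, a minimal sketch of what a finite, configuration-backed timeout in the style of `RpcUtils` could look like after the removal; the property name `spark.example.askTimeout` and the helper below are hypothetical, not the actual configuration added by this PR (only the `120s` default mirrors the review above):

   ```scala
   // Sketch only: assumes Spark-internal code, since RpcTimeout is private[spark];
   // the config key below is hypothetical.
   package org.apache.spark.util

   import scala.concurrent.duration._

   import org.apache.spark.SparkConf
   import org.apache.spark.rpc.RpcTimeout

   private[spark] object ExampleRpcTimeouts {
     /** A finite, configuration-backed timeout instead of an "infinite" one. */
     def exampleAskTimeout(conf: SparkConf): RpcTimeout = {
       val timeoutProp = "spark.example.askTimeout" // hypothetical config key
       // SparkConf.getTimeAsSeconds parses strings such as "120s" into seconds.
       new RpcTimeout(conf.getTimeAsSeconds(timeoutProp, "120s").seconds, timeoutProp)
     }
   }
   ```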



