weixiuli commented on code in PR #36141:
URL: https://github.com/apache/spark/pull/36141#discussion_r850287707
##########
core/src/main/scala/org/apache/spark/shuffle/ShuffleBlockPusher.scala:
##########
@@ -142,7 +142,7 @@ private[spark] class ShuffleBlockPusher(conf: SparkConf) extends Logging {
    * VisibleForTesting
    */
   protected def submitTask(task: Runnable): Unit = {
-    if (BLOCK_PUSHER_POOL != null) {
+    if (BLOCK_PUSHER_POOL != null && !BLOCK_PUSHER_POOL.isShutdown) {
Review Comment:
@otterc How about also catching the other exceptions here, along these lines?
```scala
private[shuffle] def tryPushUpToMax(): Unit = {
  try {
    pushUpToMax()
  } catch {
    case i: InterruptedException =>
      Thread.currentThread().interrupt()
      throw i
    case e: FileNotFoundException =>
      logWarning("The shuffle files got deleted when this shuffle-block-push-thread " +
        "was reading from them which could happen when the job finishes and the driver " +
        "instructs the executor to cleanup the shuffle. In this case, push of the blocks " +
        "belonging to this shuffle will stop.", e)
    case other => throw other
  }
}
```
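
For context (not part of the PR diff), here is a minimal standalone sketch of why the `!BLOCK_PUSHER_POOL.isShutdown` guard above helps: calling `execute` on a thread pool that has already been shut down throws `RejectedExecutionException`, so checking `isShutdown` first lets `submitTask` skip late pushes instead of failing. The object and method names in the sketch are made up for illustration.

```scala
import java.util.concurrent.{Executors, RejectedExecutionException}

// Hypothetical standalone demo; names are illustrative only.
object PusherPoolShutdownSketch {
  def main(args: Array[String]): Unit = {
    val pool = Executors.newFixedThreadPool(1)
    pool.shutdown()

    // Without the guard: a task submitted after shutdown() is rejected
    // by the pool's default rejection policy.
    try {
      pool.execute(() => println("never runs"))
    } catch {
      case _: RejectedExecutionException =>
        println("execute() after shutdown() was rejected")
    }

    // With the guard (mirroring the change to submitTask above): skip the
    // push once the pool has been shut down instead of throwing.
    if (!pool.isShutdown) {
      pool.execute(() => println("pushing blocks"))
    } else {
      println("pool already shut down; skipping the push task")
    }
  }
}
```

Note the guard is best-effort: a shutdown that races between the `isShutdown` check and `execute` can still reject the task, so tolerating `RejectedExecutionException` in the caller may also be worth considering.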
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]