srowen commented on a change in pull request #25674: [SPARK-28340][CORE] Noisy
exceptions when tasks are killed: "DiskBloc…
URL: https://github.com/apache/spark/pull/25674#discussion_r320770218
##########
File path:
core/src/main/scala/org/apache/spark/storage/ShuffleBlockFetcherIterator.scala
##########
@@ -349,8 +350,15 @@ final class ShuffleBlockFetcherIterator(
results.put(new SuccessFetchResult(blockId,
blockManager.blockManagerId,
buf.size(), buf, false))
} catch {
+ // If we see an exception, stop immediately.
+ case c: ClosedByInterruptException =>
+ // ClosedByInterruptException is an expected exception when killing a task,
+ // don't log the exception stack trace to avoid confusing users.
+ // See: SPARK-28340
+ logError(s"Interrupt occurred while fetching local blocks")
Review comment:
Remove the `s` interpolation, since the string contains no interpolated values. Should you name the exception here for clarity? In
both cases you could log the exception message in the string here instead, to
preserve some detail.
It'd be nice to avoid repeating the `results.put(...)`; is this worth it?
```scala
case e: Exception =>
  e match {
    case ce: ClosedByInterruptException => logError(...)
    case ex: Exception => logError(...)
  }
  results.put(...)
  return
```
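
A fuller, runnable sketch of that pattern might look like the following. Note this is illustrative only: `FetchResult`, `FailureFetchResult`, `fetchLocalBlock`, and the `println`-based logging are hypothetical stand-ins, not the actual `ShuffleBlockFetcherIterator` code.

```scala
import java.nio.channels.ClosedByInterruptException
import java.util.concurrent.LinkedBlockingQueue

// Hypothetical stand-ins for the iterator's result types, for illustration only.
sealed trait FetchResult
case class FailureFetchResult(blockId: String, msg: String) extends FetchResult

object CatchPatternSketch {
  val results = new LinkedBlockingQueue[FetchResult]()

  def fetchLocalBlock(blockId: String, doFetch: () => Unit): Unit = {
    try {
      doFetch()
    } catch {
      case e: Exception =>
        // Branch only on how to log: an interrupt during a task kill is
        // expected, so it is logged without the full stack trace.
        e match {
          case _: ClosedByInterruptException =>
            println(s"Interrupt occurred while fetching local blocks: ${e.getMessage}")
          case other: Exception =>
            println(s"Error occurred while fetching local blocks: $other")
        }
        // results.put(...) appears exactly once, after the match,
        // instead of being repeated in each case.
        results.put(FailureFetchResult(blockId, String.valueOf(e.getMessage)))
        return
    }
  }
}
```

Usage: calling `fetchLocalBlock("b0", () => throw new ClosedByInterruptException())` records one failure result and logs without a stack trace, while any other exception takes the second branch; both paths share the single `results.put` and `return`.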
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]