srowen commented on a change in pull request #25674: [SPARK-28340][CORE] Noisy exceptions when tasks are killed: "DiskBloc…
URL: https://github.com/apache/spark/pull/25674#discussion_r320768825
 
 

 ##########
 File path: 
core/src/main/scala/org/apache/spark/storage/DiskBlockObjectWriter.scala
 ##########
 @@ -219,6 +219,12 @@ private[spark] class DiskBlockObjectWriter(
         truncateStream = new FileOutputStream(file, true)
         truncateStream.getChannel.truncate(committedPosition)
       } catch {
+        // ClosedByInterruptException is an expected exception when a task is
+        // killed; don't log the exception stack trace to avoid confusing users.
+        // See: SPARK-28340
+        case _: ClosedByInterruptException =>
+          logError("ClosedByInterruptException occurred while reverting partial writes to file"
+            + ", it may be caused by job cancellation.")
 
 Review comment:
   Is this second part meaningful? Maybe just `ClosedByInterruptException while reverting partial writes to file" + file` (i.e. keep the file name)
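A standalone sketch of the catch clause being discussed, incorporating the review suggestion to keep the file name in the message. This is not the actual `DiskBlockObjectWriter` code; `RevertSketch`, `revertPartialWrites`, and the `truncate` callback are hypothetical names used so the snippet is self-contained, and returning the message instead of calling `logError` is purely for illustration.

```scala
import java.nio.channels.ClosedByInterruptException

// Hypothetical sketch: on ClosedByInterruptException (expected when a task is
// killed mid-write), emit a one-line message that names the file, with no
// stack trace, so task kills don't flood the logs.
object RevertSketch {
  def revertPartialWrites(fileName: String, truncate: () => Unit): Option[String] = {
    try {
      truncate() // stands in for truncateStream.getChannel.truncate(...)
      None       // success: nothing to log
    } catch {
      case _: ClosedByInterruptException =>
        // Expected on task kill: short message only, keep the file name.
        Some(s"ClosedByInterruptException while reverting partial writes to file $fileName")
      case e: Exception =>
        // Unexpected failures still surface the exception itself.
        Some(s"Uncaught exception while reverting partial writes to file $fileName: $e")
    }
  }
}
```

Returning `Option[String]` here is just a test-friendly stand-in for the logger call; the shape of the catch clauses is the point.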

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
