HeartSaVioR commented on a change in pull request #24154: [SPARK-27210][SS] Cleanup incomplete output files in ManifestFileCommitProtocol if task is aborted
URL: https://github.com/apache/spark/pull/24154#discussion_r267604426
 
 

 ##########
 File path: sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/ManifestFileCommitProtocol.scala
 ##########
 @@ -114,7 +117,10 @@ class ManifestFileCommitProtocol(jobId: String, path: String)
   }
 
   override def abortTask(taskContext: TaskAttemptContext): Unit = {
-    // Do nothing
-    // TODO: we can also try delete the addedFiles as a best-effort cleanup.
+    // best effort cleanup of incomplete files
+    if (addedFiles.nonEmpty) {
 +      val fs = new Path(addedFiles.head).getFileSystem(taskContext.getConfiguration)
+      addedFiles.foreach { file => fs.delete(new Path(file), false) }
 
 Review comment:
   I just followed what HadoopMapReduceCommitProtocol has been doing, but if we want to try deleting all files while ignoring exceptions (maybe with a log message? I'm not sure which log level would be appropriate), we can apply that to both places: HadoopMapReduceCommitProtocol and here.
   
   @zsxwing What do you think?
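   For the "delete all files while ignoring exceptions" idea, a minimal sketch might look like the following. This is a hypothetical variant, not the actual patch: it assumes the class mixes in Spark's `Logging` trait for `logWarning`, and swallows only non-fatal exceptions so one failed delete doesn't stop the rest of the cleanup.

```scala
import org.apache.hadoop.fs.Path
import org.apache.hadoop.mapreduce.TaskAttemptContext

import scala.util.control.NonFatal

  override def abortTask(taskContext: TaskAttemptContext): Unit = {
    // Best-effort cleanup: attempt to delete every incomplete file,
    // logging and continuing if any individual delete fails.
    if (addedFiles.nonEmpty) {
      val fs = new Path(addedFiles.head).getFileSystem(taskContext.getConfiguration)
      addedFiles.foreach { file =>
        try {
          fs.delete(new Path(file), false)
        } catch {
          case NonFatal(e) =>
            logWarning(s"Failed to delete incomplete file $file on task abort", e)
        }
      }
    }
  }
```

   The same try/catch-per-file pattern could be applied in HadoopMapReduceCommitProtocol if we decide to go that way.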

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.