n3nash commented on a change in pull request #1288: [HUDI-117] Close file handle before throwing an exception due to append…
URL: https://github.com/apache/incubator-hudi/pull/1288#discussion_r372190443
 
 

 ##########
 File path: hudi-common/src/main/java/org/apache/hudi/common/table/log/HoodieLogFormatWriter.java
 ##########
 @@ -256,7 +280,22 @@ private void handleAppendExceptionOrRecoverLease(Path path, RemoteException e)
         throw new HoodieException(e);
       }
     } else {
-      throw new HoodieIOException("Failed to open an append stream ", e);
+      // When fs.append() has failed and an exception is thrown, by closing the output stream
+      // we shall force hdfs to release the lease on the log file. When Spark driver retries this task (with
+      // new attemptId, say taskId.1) it will be able to acquire lease on the log file (as output stream was
+      // closed properly by taskId.0).
+      //
+      // If close() call were to fail throwing an exception, our best bet is to rollover to a new log file.
+      try {
+        close();
+        // output stream has been successfully closed and lease on the log file has been released.
+        throw new HoodieIOException("Failed to append to the output stream ", e);
 
 Review comment:
   Can you add a comment here explaining why we are throwing an exception at this point?
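
The pattern under discussion (close the output stream so HDFS releases its lease on the log file, then rethrow so the retried task attempt can reacquire the lease, falling back to a rollover if close() itself fails) can be sketched as follows. This is a hedged illustration with assumed names (`AppendFailureHandler`, `rollOver`), not the actual Hudi `HoodieLogFormatWriter` code:

```java
import java.io.Closeable;
import java.io.IOException;

public class AppendFailureHandler {
  private final Closeable outputStream;
  private boolean rolledOver = false;

  public AppendFailureHandler(Closeable outputStream) {
    this.outputStream = outputStream;
  }

  public void handleAppendFailure(IOException cause) {
    try {
      // Closing the stream forces HDFS to release the lease on the log file,
      // so a retried task attempt (e.g. taskId.1) can acquire it.
      outputStream.close();
      // Close succeeded: surface the original append failure as an unchecked
      // exception so the framework retries the task. Being unchecked, it is
      // not intercepted by the catch block below.
      throw new RuntimeException("Failed to append to the output stream", cause);
    } catch (IOException closeFailure) {
      // close() itself failed, so the lease may still be held; the best bet
      // is to roll over to a new log file instead of retrying this one.
      rollOver();
    }
  }

  private void rollOver() {
    // Placeholder for creating the next log file version.
    rolledOver = true;
  }

  public boolean isRolledOver() {
    return rolledOver;
  }
}
```

Note that the rethrown exception is deliberately unchecked (in Hudi it is a `HoodieIOException`), so the `catch (IOException)` clause only intercepts failures from `close()` itself, not the rethrow.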

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
