leosanqing opened a new issue, #13537:
URL: https://github.com/apache/hudi/issues/13537

   
   **Describe the problem you faced**
   
   I've searched all existing issues but could not find the same problem. It has 
occurred several times in our production environment.
   Sorry, I cannot share full details because of company rules.
   
   The error is `org.apache.hudi.exception.HoodieException: java.io.IOException: 
unable to read commit metadata:`
   
   at 
org.apache.hudi.sink.partitioner.profile.WriteProfiles.getCommitMetadata(WriteProfiles.java)
   
   org/apache/hudi/common/table/timeline/TimelineUtils.java:321 (0.15 version)
   
   <img width="1470" height="603" alt="Image" 
src="https://github.com/user-attachments/assets/621f4921-e70d-44ad-ac15-924fefd10f6e" />
   
    I tried to figure it out. When I inspected the completed `<instant>.commit` 
file, I found that the file was incomplete, like this:
   
   <img width="1322" height="1260" alt="Image" 
src="https://github.com/user-attachments/assets/10bb18b4-5fb6-404f-be02-6a7cfbaf3ab9" />
   
   It is not a complete JSON file, so parsing it throws a JSON exception.
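   To illustrate the failure mode, here is a minimal sketch (in Python rather 
than Hudi's Java, and with a made-up, simplified commit payload rather than 
Hudi's real schema) showing that a truncated JSON document cannot be parsed at 
all, which matches the "unable to read commit metadata" error:

   ```python
   import json

   # Hypothetical, simplified commit-metadata payload (not Hudi's real schema).
   complete = '{"partitionToWriteStats": {"p1": [{"numWrites": 100}]}, "operationType": "UPSERT"}'

   # Simulate a truncated write: the file on HDFS ends mid-document.
   truncated = complete[: len(complete) // 2]

   json.loads(complete)  # the full file parses fine

   try:
       json.loads(truncated)
   except json.JSONDecodeError as e:
       # Any truncation point leaves unbalanced braces or an unterminated
       # string, so parsing always fails; there is no "partial" metadata
       # to recover from a half-written commit file.
       print("parse failed:", e.msg)
   ```

   The point is that the parser cannot distinguish "truncated file" from 
"corrupt file"; either way the reader sees an IOException wrapping the parse 
error.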
   
   I initially suspected two possible causes: 1. An issue with the JSON 
output. 2. An issue when writing to HDFS. I ruled out the first possibility 
after reviewing the writing process: the object is converted directly to a JSON 
string and not modified afterward. Therefore, the only 
remaining possibility is that the file was truncated for some reason during the 
write to HDFS (i.e., less data was written than expected). Following the code 
logic, I located this specific 
place: `org.apache.hudi.common.table.timeline.HoodieActiveTimeline#createFileInMetaPath`
   
   But I have no idea what condition could lead to this.
   
   Maybe it is not a Hudi bug but an HDFS bug. Can anyone give some suggestions?
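   Not a fix for the root cause, but a common mitigation for this class of 
problem is the write-to-temp-then-rename pattern: write the metadata to a 
temporary file, flush and sync it, and only then rename it to the final 
completed-instant name, so readers never observe a half-written file. Below is 
a minimal sketch on a local filesystem; the function name and paths are made up 
for illustration, and on HDFS the equivalent would use the FileSystem API with 
`hflush()`/`hsync()` followed by a rename:

   ```python
   import json
   import os
   import tempfile

   def write_commit_metadata_atomically(final_path: str, metadata: dict) -> None:
       """Write JSON so readers see either nothing or the complete file."""
       dir_name = os.path.dirname(final_path) or "."
       # Write to a temp file in the same directory so the rename stays atomic.
       fd, tmp_path = tempfile.mkstemp(dir=dir_name, suffix=".inflight")
       try:
           with os.fdopen(fd, "w") as f:
               f.write(json.dumps(metadata))
               f.flush()
               os.fsync(f.fileno())  # force bytes to storage before publishing
           os.replace(tmp_path, final_path)  # atomic publish under the final name
       except BaseException:
           if os.path.exists(tmp_path):
               os.remove(tmp_path)
           raise
   ```

   With this pattern, a crash or truncation mid-write leaves only an 
`.inflight` temp file behind; the `.commit` name never points at partial data.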
   
   **To Reproduce**
   
   Because it is an intermittent issue, not a consistently reproducible one, I 
am unable to replicate it.
   
   **Expected behavior**
   
   The completed commit metadata file should be written as a complete, 
parseable JSON document.
   
   **Environment Description**
   
   * Hudi version : 0.11
   
   * Flink version : 1.14
   
   * Hive version :
   
   * Hadoop version : 2.6.x; 3.2.1 
   
   * Storage (HDFS/S3/GCS..) : HDFS
   
   * Running on Docker? (yes/no) :
   
   
   
   **Stacktrace**
   
   ```
   org.apache.hudi.exception.HoodieException: java.io.IOException: unable to read commit metadata:
       at org.apache.hudi.sink.partitioner.profile.WriteProfiles.getCommitMetadata(WriteProfiles.java)
   ```
   
   Jackson throws an exception: cannot parse.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
