notAprogrammer-0 commented on issue #12485:
URL: https://github.com/apache/hudi/issues/12485#issuecomment-2572398987
> Hi @notAprogrammer-0
>
> The issue is not reproducible on the local file system, which leads me to suspect an HDFS-related problem rather than a Hudi-specific one. I suggest examining the HDFS logs for client disconnections or other relevant errors.
>
> **Application start time:** 2025-01-02 14:56:51
> **Application killed time:** 2025-01-02 20:16:44
>
> ```shell
> 2025-01-02 14:56:51 WARN NativeCodeLoader:60 - Unable to load
native-hadoop library for your platform... using builtin-java classes where
applicable
> 2025-01-02 14:56:51 INFO EmbeddedTimelineService:67 - Starting Timeline
service !!
> 2025-01-02 14:56:51 WARN EmbeddedTimelineService:104 - Unable to find
driver bind address from spark config
> 2025-01-02 14:56:51 INFO FileSystemViewManager:232 - Creating View
Manager with storage type :MEMORY
> 2025-01-02 14:56:51 INFO FileSystemViewManager:244 - Creating in-memory
based Table View
> 2025-01-02 14:56:51 INFO log:193 - Logging initialized @1368ms to
org.apache.hudi.org.apache.jetty.util.log.Slf4jLog
>
> ..................
>
> 2025-01-02 20:16:43 INFO HoodieActiveTimeline:556 - Checking for file
exists ?file:/tmp/my_test/.hoodie/20250102201639581.clean.requested
> 2025-01-02 20:16:43 INFO HoodieActiveTimeline:564 - Create new file for
toInstant ?file:/tmp/my_test/.hoodie/20250102201639581.clean.inflight
> 2025-01-02 20:16:43 INFO CleanActionExecutor:133 - Using
cleanerParallelism: 3
> 2025-01-02 20:16:44 INFO HoodieActiveTimeline:129 - Loaded instants upto
: Option{val=[==>20250102201639581__clean__INFLIGHT]}
> 2025-01-02 20:16:44 INFO HoodieActiveTimeline:556 - Checking for file
exists ?file:/tmp/my_test/.hoodie/20250102201639581.clean.inflight
> 2025-01-02 20:16:44 INFO HoodieActiveTimeline:564 - Create new file for
toInstant ?file:/tmp/my_test/.hoodie/20250102201639581.clean
> 2025-01-02 20:16:44 INFO CleanActionExecutor:226 - Marked clean started
on 20250102201639581 as complete
> ```
Ok, I'll check my HDFS logs. Thanks for your help.
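
For anyone following along, a minimal sketch of the kind of search to run against the HDFS daemon logs around the kill time (2025-01-02 20:16). The log directory and file-name pattern are assumptions for a typical Hadoop install; adjust them to your cluster's `HADOOP_LOG_DIR`:

```shell
#!/bin/sh
# Hypothetical sketch: scan HDFS NameNode/DataNode logs for the minutes
# around the application kill time and surface lines that suggest a
# client disconnection. Paths below are assumptions, not from the issue.
LOG_DIR="${HADOOP_LOG_DIR:-/var/log/hadoop/hdfs}"

for log in "$LOG_DIR"/hadoop-hdfs-*.log; do
  # Skip if the glob matched nothing (pattern passed through literally).
  [ -f "$log" ] || continue
  echo "== $log =="
  # Narrow to the 20:10-20:19 window, then filter for common
  # disconnect/lease symptoms. `|| true` keeps a no-match grep from
  # failing the script.
  grep -E '2025-01-02 20:1[0-9]' "$log" |
    grep -Ei 'disconnect|lease|broken pipe|connection reset|sockettimeout' ||
    true
done
```

The keyword list is just a starting point; if the application was killed externally (e.g. by YARN or the OOM killer), the corresponding ResourceManager or system logs would be the next place to look.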
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]