Jason-liujc commented on issue #11684:
URL: https://github.com/apache/hudi/issues/11684#issuecomment-2253576047

   Incremental query options:
   ```
   Map(hoodie.datasource.query.type -> incremental, 
hoodie.datasource.read.begin.instanttime -> 20240724070000000, 
hoodie.datasource.read.end.instanttime -> 20240725070000000, 
hoodie.read.timeline.holes.resolution.policy -> USE_TRANSITION_TIME)
   ```
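   
   For context, a minimal sketch of how options like these would be passed to a Spark incremental read (the table path and app name here are placeholders, not from my actual job):
   ```scala
   // Sketch only: incremental read of a Hudi table using the options above.
   // "s3://bucket/path/to/table" is a hypothetical placeholder path.
   import org.apache.spark.sql.SparkSession

   val spark = SparkSession.builder().appName("hudi-incremental-read").getOrCreate()

   val incrementalDf = spark.read
     .format("hudi")
     .option("hoodie.datasource.query.type", "incremental")
     .option("hoodie.datasource.read.begin.instanttime", "20240724070000000")
     .option("hoodie.datasource.read.end.instanttime", "20240725070000000")
     .option("hoodie.read.timeline.holes.resolution.policy", "USE_TRANSITION_TIME")
     .load("s3://bucket/path/to/table")
   ```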
   
   I don't really have a script at the moment, but the steps are basically covered in the reproduce section:
   
   1. Create and continuously write to a COW table for, say, 10 days; some events are updates, some are newly inserted
   2. Have an async cleaning job clean the table every day, keeping only the 1 or 2 latest file versions
   3. Run an incremental query with a time range from 10 days ago to 9 days ago
   4. The incremental query should fail with that error
   
   I'm under the assumption this behavior is expected from Hudi, since the underlying files would be deleted by the cleaner. My main point here is just that the error message should be a bit clearer if possible.

