rahil-c opened a new pull request, #17731:
URL: https://github.com/apache/hudi/pull/17731

   ### Describe the issue this Pull Request addresses
   
   Feature: https://github.com/apache/hudi/issues/14127
   
   Goal: Write to a Hudi COW table using bulk-insert, insert, update, and delete operations with Lance base files, and read the data back with the Spark Datasource.
   
   Exit Criteria: We should be able to construct a test that writes out 
multiple commits with Spark and reads the same data back. Testing should 
also cover time travel and incremental queries to ensure basic 
functionality works end to end.
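
The read paths named in the exit criteria (snapshot, time travel, incremental) are driven by option maps passed to `spark.read.format("hudi")`. The sketch below is a minimal illustration using the standard Hudi Spark read options (`as.of.instant`, `hoodie.datasource.query.type`, `hoodie.datasource.read.begin.instanttime`); the helper functions themselves are hypothetical and not part of this PR.

```python
# Hypothetical helpers building the option maps a time-travel and an
# incremental read would pass to spark.read.format("hudi").options(...).
# The option keys are the standard Hudi Spark Datasource read options.

def time_travel_opts(instant: str) -> dict:
    # Read the table as of a past commit instant (time travel).
    return {"as.of.instant": instant}

def incremental_opts(begin_instant: str) -> dict:
    # Read only rows added or changed after begin_instant.
    return {
        "hoodie.datasource.query.type": "incremental",
        "hoodie.datasource.read.begin.instanttime": begin_instant,
    }

# Example: pull changes committed after instant "20240101000000".
opts = incremental_opts("20240101000000")
```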
   
   ### Summary and Changelog
   
   * Implement the `InternalRowWriter` interface used by bulk insert 
(`HoodieInternalRowLanceWriter.java`)
   * Implement `FileFormatUtils` so that upsert/delete functionality 
works correctly
   * Add more test cases in `TestLanceDataSource` to ensure full 
coverage of the above
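
The write paths the bullets above exercise (bulk insert, insert, upsert, delete on a COW table) are selected via the `hoodie.datasource.write.operation` option. Below is a minimal sketch of the option map such a test would pass to `df.write.format("hudi")`, using the standard Hudi write options; the builder function is illustrative only, not code from this PR.

```python
# Illustrative builder for the Hudi Spark Datasource write options used by
# the operations this PR exercises against a copy-on-write (COW) table.
# Option keys are the standard Hudi write options; the helper itself is
# hypothetical and not part of the PR.

VALID_OPERATIONS = {"bulk_insert", "insert", "upsert", "delete"}

def cow_write_opts(table_name: str, record_key: str, operation: str) -> dict:
    if operation not in VALID_OPERATIONS:
        raise ValueError(f"unsupported operation: {operation}")
    return {
        "hoodie.table.name": table_name,
        "hoodie.datasource.write.recordkey.field": record_key,
        "hoodie.datasource.write.table.type": "COPY_ON_WRITE",
        "hoodie.datasource.write.operation": operation,
    }

# Example: options for the initial bulk insert of a test table.
opts = cow_write_opts("lance_cow_tbl", "id", "bulk_insert")
```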
   
   ### Impact
   
   None
   
   ### Risk Level
   
   Low
   
   ### Documentation Update
   
   None
   
   ### Contributor's checklist
   
   - [ ] Read through [contributor's 
guide](https://hudi.apache.org/contribute/how-to-contribute)
   - [ ] Enough context is provided in the sections above
   - [ ] Adequate tests were added if applicable


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
