MehulBatra opened a new pull request, #1797:
URL: https://github.com/apache/fluss/pull/1797

   <!--
   *Thank you very much for contributing to Fluss - we are happy that you want 
to help us improve Fluss. To help the community review your contribution in the 
best possible way, please go through the checklist below, which will get the 
contribution into a shape in which it can be best reviewed.*
   
   ## Contribution Checklist
   
     - Make sure that the pull request corresponds to a [GitHub 
issue](https://github.com/apache/fluss/issues). Exceptions are made for typos 
in JavaDoc or documentation files, which need no issue.
   
     - Name the pull request in the format "[component] Title of the pull 
request", where *[component]* should be replaced by the name of the component 
being changed. Typically, this corresponds to the component label assigned to 
the issue (e.g., [kv], [log], [client], [flink]). Skip *[component]* if you are 
unsure about which is the best component.
   
     - Fill out the template below to describe the changes contributed by the 
pull request. That will give reviewers the context they need to do the review.
   
     - Make sure that the change passes the automated tests, i.e., `mvn clean 
verify` passes.
   
     - Each pull request should address only one issue, not mix up code from 
multiple issues.
   
   
   **(The sections below can be removed for hotfixes or typos)**
   -->
   
   ### Purpose
   
   <!-- Linking this pull request to the issue -->
   Linked issues: closes #1727, closes #1559
   
   Support `$lake` suffix queries for Iceberg tables via the Fluss Flink connector
   
   <!-- What is the purpose of the change -->
   
   ### Brief change log
   
   Added pluggable lake format support, enabling Iceberg alongside the existing 
Paimon integration. Core components now use a reflection-based factory pattern 
for format-agnostic catalog and table operations.
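
   The reflection-based factory pattern described above can be sketched roughly as 
follows. This is a minimal illustration, not the actual Fluss code: the 
`DataLakeFormat` enum, `LakeCatalogFactory` interface, and factory class names 
here are all stand-ins; only the general technique (resolving a format-specific 
factory class by name so the common module has no compile-time dependency on 
Paimon or Iceberg) comes from this PR.

   ```java
// Illustrative sketch of a reflection-based, format-agnostic factory.
// All names below are hypothetical, not real Fluss APIs.
public class LakeCatalogSketch {

    // Stand-in for a DataLakeFormat enum that knows which factory to load.
    enum DataLakeFormat {
        PAIMON("PaimonCatalogFactory"),
        ICEBERG("IcebergCatalogFactory");

        final String factorySimpleName;

        DataLakeFormat(String factorySimpleName) {
            this.factorySimpleName = factorySimpleName;
        }
    }

    interface LakeCatalogFactory {
        String createCatalog();
    }

    public static class PaimonCatalogFactory implements LakeCatalogFactory {
        public String createCatalog() { return "paimon-catalog"; }
    }

    public static class IcebergCatalogFactory implements LakeCatalogFactory {
        public String createCatalog() { return "iceberg-catalog"; }
    }

    // The caller never references a concrete Paimon or Iceberg class at
    // compile time; the class is resolved and instantiated reflectively.
    static LakeCatalogFactory loadFactory(DataLakeFormat format) {
        String className =
                LakeCatalogSketch.class.getName() + "$" + format.factorySimpleName;
        try {
            return (LakeCatalogFactory)
                    Class.forName(className)
                            .getDeclaredConstructor()
                            .newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException(
                    "Could not load lake catalog factory for " + format, e);
        }
    }

    public static void main(String[] args) {
        // Selecting ICEBERG loads the Iceberg factory without any direct import.
        System.out.println(loadFactory(DataLakeFormat.ICEBERG).createCatalog());
    }
}
   ```

   In the real modules the factory class would live in a separate artifact 
(e.g. the Iceberg lake module) on the classpath, which is what makes the 
integration pluggable.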
   
   
   Flink Common Module:
   
   - LakeCatalog.java: Refactored to accept DataLakeFormat enum; removed 
hardcoded Paimon dependencies; added reflection-based catalog instantiation
   
   - LakeTableFactory.java: Made pluggable via DataLakeFormat parameter; uses 
reflection to create format-specific table factories
   
   - FlinkCatalog.java: Added lake format detection from table.datalake.format 
option; passes format to LakeCatalog
   
   - FlinkTableFactory.java: Reads connector property to determine format; 
passes context to LakeTableFactory
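
   The lake format detection mentioned for FlinkCatalog.java might look roughly 
like the sketch below. The option key `table.datalake.format` is taken from 
this PR; the helper method and enum are hypothetical stand-ins for the actual 
Fluss classes.

   ```java
import java.util.Locale;
import java.util.Map;

// Hedged sketch: detect the configured lake format from table options.
public class LakeFormatDetection {

    enum DataLakeFormat { PAIMON, ICEBERG }

    static final String DATALAKE_FORMAT_KEY = "table.datalake.format";

    // Returns null when the table has no lake format configured, so callers
    // can fall back to non-lake behavior.
    static DataLakeFormat detectFormat(Map<String, String> tableOptions) {
        String value = tableOptions.get(DATALAKE_FORMAT_KEY);
        if (value == null) {
            return null;
        }
        return DataLakeFormat.valueOf(value.trim().toUpperCase(Locale.ROOT));
    }

    public static void main(String[] args) {
        Map<String, String> options = Map.of(DATALAKE_FORMAT_KEY, "iceberg");
        System.out.println(detectFormat(options)); // ICEBERG
    }
}
   ```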
   
   
   Dependencies:
   
   fluss-flink-common/pom.xml
   
   - Added iceberg-flink-${flink.major.version} (provided scope, non-shaded)
   
   fluss-lake-iceberg/pom.xml
   
   - Overrode the Jackson BOM to 2.18.3 for Iceberg compatibility
   - Added flink-connector-files and iceberg-flink dependencies (test scope)
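
   The Jackson BOM override in `fluss-lake-iceberg/pom.xml` might take roughly 
this shape (a sketch only: the coordinates are those of the standard Jackson 
BOM artifact, and the 2.18.3 version comes from this PR; the exact placement 
in the module's pom is assumed):

   ```xml
<dependencyManagement>
  <dependencies>
    <!-- Pin Jackson to 2.18.3 for Iceberg compatibility -->
    <dependency>
      <groupId>com.fasterxml.jackson</groupId>
      <artifactId>jackson-bom</artifactId>
      <version>2.18.3</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
   ```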
   
   
   ### Tests
   
   FlinkUnionReadLogTableITCase.java
   
   - Added `testReadIcebergLakeTableDirectly()` for `$lake` suffix queries
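
   For context, a `$lake` suffix query targets the lake (Iceberg) table directly 
rather than the union read. A small illustration of the naming convention 
(database and table names are made up; only the `$lake` suffix comes from this 
PR):

   ```java
// Hypothetical helper showing the `$lake` suffix convention for direct
// lake-table reads; not an actual Fluss API.
public class LakeSuffixQuery {

    // Builds a query against "<table>$lake", i.e. the Iceberg table itself.
    static String lakeQuery(String database, String table) {
        return String.format("SELECT * FROM %s.`%s$lake`", database, table);
    }

    public static void main(String[] args) {
        System.out.println(lakeQuery("fluss", "orders"));
        // SELECT * FROM fluss.`orders$lake`
    }
}
   ```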
   
   ### API and Format
   
   <!-- Does this change affect API or storage format -->
   
   ### Documentation
   
   <!-- Does this change introduce a new feature -->
   Yes, documentation needs to be added for reading Iceberg lake tables directly 
via the `$lake` suffix.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.