cshuo opened a new pull request, #18712:
URL: https://github.com/apache/hudi/pull/18712

   …TOR columns
   
   ### Describe the issue this Pull Request addresses
   
   Flink table reads could fail for Hudi tables created by Spark when the table 
schema contains VECTOR columns, even when the query projects only non-VECTOR 
fields. The failure stemmed from Flink-side schema conversion and reader 
validation failing to distinguish the full table schema from the 
required/projected read schema.
   
   This PR improves Flink compatibility with Spark-written VECTOR tables by 
converting VECTOR schema metadata into Flink array element types while still 
rejecting actual VECTOR column reads until Flink reader support is implemented.
   
   ### Summary and Changelog
   
   - Adds VECTOR handling to `HoodieSchemaConverter`, mapping VECTOR element 
types to Flink arrays:
     - `FLOAT` -> `ARRAY<FLOAT>`
     - `DOUBLE` -> `ARRAY<DOUBLE>`
     - `INT8` -> `ARRAY<TINYINT>`
   - Adds Flink reader validation via 
`DataTypeUtils.validateReaderSupportedDataTypes`, rejecting reads only when 
projected fields include VECTOR columns.
   - Updates `HoodieTableSource` to validate the produced/read data type before 
building the source pipeline, and caches the resolved table schema.
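   The element-type mapping above can be sketched as a plain switch. This is a 
self-contained illustration, not the actual converter: the real code in 
`HoodieSchemaConverter` produces Flink `LogicalType` instances, while this 
sketch returns the equivalent Flink SQL type string, and the class and method 
names here are hypothetical.

   ```java
   public class VectorTypeMappingSketch {
       // Hypothetical helper mirroring the VECTOR element-type mapping in this PR:
       // FLOAT -> ARRAY<FLOAT>, DOUBLE -> ARRAY<DOUBLE>, INT8 -> ARRAY<TINYINT>.
       static String toFlinkArrayType(String vectorElementType) {
           switch (vectorElementType) {
               case "FLOAT":  return "ARRAY<FLOAT>";
               case "DOUBLE": return "ARRAY<DOUBLE>";
               case "INT8":   return "ARRAY<TINYINT>";
               default:
                   throw new IllegalArgumentException(
                       "Unsupported VECTOR element type: " + vectorElementType);
           }
       }

       public static void main(String[] args) {
           System.out.println(toFlinkArrayType("FLOAT")); // ARRAY<FLOAT>
           System.out.println(toFlinkArrayType("INT8"));  // ARRAY<TINYINT>
       }
   }
   ```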
   
   ### Impact
   
   - **Functional impact**: Flink can build read pipelines and query non-VECTOR 
columns from Hudi tables that contain VECTOR columns; direct VECTOR column 
reads still fail with a clear validation error.
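   The projection-sensitive rejection can be illustrated with a minimal sketch. 
The real check is `DataTypeUtils.validateReaderSupportedDataTypes` and operates 
on Flink data types; the field-name map, class name, and error message below 
are illustrative assumptions only.

   ```java
   import java.util.List;
   import java.util.Map;

   public class ReaderValidationSketch {
       // isVectorField: table field name -> whether that field is a VECTOR column.
       // Only the fields actually projected by the query are checked, so reads of
       // non-VECTOR columns succeed even when the table contains VECTOR columns.
       static void validateProjectedFields(Map<String, Boolean> isVectorField,
                                           List<String> projectedFields) {
           for (String field : projectedFields) {
               if (Boolean.TRUE.equals(isVectorField.get(field))) {
                   throw new UnsupportedOperationException(
                       "Flink reader does not support VECTOR column: " + field);
               }
           }
       }
   }
   ```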
   
   ### Risk Level
   low
   
   
   ### Documentation Update
   
   <!-- Describe any necessary documentation update if there is any new 
feature, config, or user-facing change. If not, put "none".
   
   - The config description must be updated if new configs are added or the 
default value of the configs are changed.
   - Any new feature or user-facing change requires updating the Hudi website. 
Please follow the 
     [instruction](https://hudi.apache.org/contribute/developer-setup#website) 
to make changes to the website. -->
   
   ### Contributor's checklist
   
   - [ ] Read through [contributor's 
guide](https://hudi.apache.org/contribute/how-to-contribute)
   - [ ] Enough context is provided in the sections above
   - [ ] Adequate tests were added if applicable
   


-- 
This is an automated message from the Apache Git Service.