kkrugler opened a new issue, #8136:
URL: https://github.com/apache/hudi/issues/8136

   **Describe the problem you faced**
   
   When running an incremental query read from a table with Flink 1.15 and the Hudi 0.13.0 release, I get a `java.lang.NoSuchMethodError`.
   
   I believe the issue is that the new `ParquetColumnarRowSplitReader` added for Flink 1.16 returns `ColumnarRowData`, but it should return `RowData`, as the other versions of this class do (for Flink 1.13/1.14/1.15).
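
   For context, the JVM links method calls by their full descriptor, return type included, so a caller compiled against a `RowData`-returning `nextRecord()` fails at runtime when the class on the classpath declares it to return `ColumnarRowData` instead. A minimal sketch of that effect (the `Reader`/`DescriptorDemo` names and the `CharSequence`/`String` types are hypothetical stand-ins, not the real Hudi/Flink classes), using exact-typed `MethodHandles` lookup to make the descriptor mismatch visible:

   ```java
   import java.lang.invoke.MethodHandles;
   import java.lang.invoke.MethodType;

   public class DescriptorDemo {
       // Hypothetical stand-in for the reader; the real types are Hudi's
       // ParquetColumnarRowSplitReader and Flink's RowData/ColumnarRowData.
       static class Reader {
           // The version on the classpath returns CharSequence...
           public CharSequence nextRecord() {
               return "row";
           }
       }

       public static void main(String[] args) throws Throwable {
           MethodHandles.Lookup lookup = MethodHandles.lookup();

           // ...but the caller was compiled against a version returning String.
           // Resolution is by the full descriptor, so this exact-typed lookup
           // fails even though a method with the same name and arguments exists:
           try {
               lookup.findVirtual(Reader.class, "nextRecord",
                       MethodType.methodType(String.class));
           } catch (NoSuchMethodException e) {
               System.out.println("lookup failed: " + e.getMessage());
           }

           // With the matching return type, resolution succeeds:
           Object row = lookup.findVirtual(Reader.class, "nextRecord",
                       MethodType.methodType(CharSequence.class))
                   .invoke(new Reader());
           System.out.println("resolved: " + row);
       }
   }
   ```

   (At a plain call site the same mismatch surfaces as `NoSuchMethodError` at link time rather than a caught `NoSuchMethodException`.)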
   
   **To Reproduce**
   
   Steps to reproduce the behavior:
   
   1. `git clone https://github.com/kkrugler/flink-hudi-query-test`
   2. Edit the `pom.xml` file to set `<hudi.version>0.13.0</hudi.version>`.
   3. Run `mvn clean package`.
   
   The `ExampleWorkflowTest.testHudiAndIncrementalQuery` test will fail.
   
   **Expected behavior**
   
   The tests should all pass.
   
   **Environment Description**
   
   * Hudi version : 0.13.0
   
   * Flink version : 1.15.1
   
   **Stacktrace**
   
   ```
   java.lang.NoSuchMethodError: org.apache.hudi.table.format.cow.vector.reader.ParquetColumnarRowSplitReader.nextRecord()Lorg/apache/flink/table/data/columnar/ColumnarRowData;
        at org.apache.hudi.table.format.ParquetSplitRecordIterator.next(ParquetSplitRecordIterator.java:50)
        at org.apache.hudi.table.format.ParquetSplitRecordIterator.next(ParquetSplitRecordIterator.java:32)
        at org.apache.hudi.table.format.mor.MergeOnReadInputFormat.nextRecord(MergeOnReadInputFormat.java:271)
        at org.apache.hudi.source.StreamReadOperator.consumeAsMiniBatch(StreamReadOperator.java:187)
        at org.apache.hudi.source.StreamReadOperator.processSplits(StreamReadOperator.java:166)
   ```
   
-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.