umeshdangat opened a new pull request, #7396:
URL: https://github.com/apache/paimon/pull/7396

    flink-connector-kafka 4.x (for Flink 2.x) removed the legacy
     KafkaDeserializationSchema interface from the streaming-connectors package.
     The replacement is KafkaRecordDeserializationSchema from the new connector
     source package, with a Collector-based deserialize() API instead of the
     single-return-value pattern.
   
     Changes:
     - pom.xml: bump flink.version 1.20.1 -> 2.1.0,
       flink.connector.kafka.version 3.3.0-1.20 -> 4.0.1-2.0
     - KafkaDebeziumAvroDeserializationSchema,
       KafkaDebeziumJsonDeserializationSchema: implement
       KafkaRecordDeserializationSchema; change deserialize() to void with
       Collector<T> output parameter; remove isEndOfStream()
     - DataFormat, AbstractDataFormat, AbstractJsonDataFormat,
       DebeziumAvroDataFormat: swap KafkaDeserializationSchema ->
       KafkaRecordDeserializationSchema in method signatures
     - KafkaActionUtils: update buildKafkaSource() and getKafkaEarliestConsumer()
       signatures; rewrite KafkaConsumerWrapper.getRecords() to use ListCollector
     - PaimonMetadataApplier: replace flink-shaded-guava31 Sets.newHashSet() with
       standard java.util.EnumSet (fixes pre-existing compilation regression)
     - Test files: update DebeziumBsonRecordParserTest and
       KafkaDebeziumJsonDeserializationSchemaTest to use new Collector-based API
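     The single-return-to-Collector change above can be sketched with minimal
     stand-ins for the Flink types (illustrative only, not the actual Paimon
     code; Flink's real `org.apache.flink.util.Collector` also declares
     `close()`):

```java
import java.util.ArrayList;
import java.util.List;

public class CollectorMigrationSketch {
    // Stand-in for org.apache.flink.util.Collector.
    interface Collector<T> {
        void collect(T record);
    }

    // Old pattern (removed KafkaDeserializationSchema): one record in, one value out.
    static String deserializeOld(byte[] value) {
        return value == null ? null : new String(value);
    }

    // New pattern (KafkaRecordDeserializationSchema): results are emitted through
    // a Collector, so one Kafka record may yield zero, one, or many outputs.
    static void deserializeNew(byte[] value, Collector<String> out) {
        if (value != null) {
            out.collect(new String(value));
        }
    }

    public static void main(String[] args) {
        List<String> results = new ArrayList<>();
        deserializeNew("hello".getBytes(), results::add);
        System.out.println(results);
    }
}
```

     The void-plus-Collector shape is also why `isEndOfStream()` is gone: end
     of stream is handled by the source's bounded/unbounded mode rather than by
     each deserialized value.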
   
     ---
     Purpose
   
     Linked issue: relates to #5350, relates to #4442
   
     flink-connector-kafka 4.x 
(https://issues.apache.org/jira/browse/FLINK-36648, 
https://github.com/apache/flink-connector-kafka/pull/140) removed the legacy 
KafkaDeserializationSchema interface as part of the Flink 2.0 release. This PR 
updates paimon-flink-cdc to compile and run correctly against Flink 2.1 and 
flink-connector-kafka 4.0.1-2.0 by replacing all usages of the removed 
interface with KafkaRecordDeserializationSchema, which uses a 
Collector<T>-based deserialize() instead of the single-return-value pattern.
   
     This is a partial step toward full Flink 2.x support tracked in #5350. The 
scope is limited to paimon-flink-cdc and its Kafka CDC ingestion path.
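     The ListCollector rewrite in KafkaConsumerWrapper.getRecords() can be
     illustrated roughly like this (a simplified stand-in for Flink's
     `ListCollector`; the `getRecords` helper shape here is hypothetical):

```java
import java.util.ArrayList;
import java.util.List;

public class ListCollectorSketch {
    // Simplified stand-in for Flink's ListCollector, which forwards
    // collect() calls into a backing List.
    static class ListCollector<T> {
        private final List<T> backing;

        ListCollector(List<T> backing) {
            this.backing = backing;
        }

        void collect(T record) {
            backing.add(record);
        }
    }

    // Hypothetical getRecords()-style helper: drive a Collector-based
    // deserializer over each raw record and return the accumulated list.
    static List<String> getRecords(List<byte[]> rawRecords) {
        List<String> out = new ArrayList<>();
        ListCollector<String> collector = new ListCollector<>(out);
        for (byte[] raw : rawRecords) {
            if (raw != null) {
                collector.collect(new String(raw));
            }
        }
        return out;
    }
}
```

     Backing the collector with a plain list is the usual way to adapt the
     push-style deserialize() API back to a pull-style "give me all records"
     call in test or wrapper code.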
   
     Tests
   
     - KafkaDebeziumJsonDeserializationSchemaTest#testDeserializeWithNonJsonKey
     - KafkaMetadataConverterTest (all cases)
     - DebeziumBsonRecordParserTest#extractInsertRecord
     - DebeziumBsonRecordParserTest#extractUpdateRecord
     - DebeziumBsonRecordParserTest#extractDeleteRecord
     - DebeziumBsonRecordParserTest#bsonConvertJsonTest
   
     API and Format
   
     No API or storage format changes.
   
     Documentation
   
     No new user-facing features introduced. No documentation changes required.
   
     Generative AI tooling
   
     Generated-by: Claude (claude-sonnet-4-5, Anthropic). The changes are
     mechanical API substitutions verified by compilation and unit tests. Per
     the ASF Generative Tooling Guidance
     (https://www.apache.org/legal/generative-tooling.html): the tool's terms
     permit use in open source contributions; the output contains no reproduced
     third-party copyrightable material; and correctness was verified via mvn
     compile and the test cases listed above.

