klam-shop opened a new pull request, #109: URL: https://github.com/apache/flink-connector-kafka/pull/109
## What is the purpose of the change

Allows writing to different Kafka topics based on the `topic` metadata column value in SQL, and updates the Table API's `KafkaDynamicSink` to accept a `List<String> topics` instead of `String topic`. The list acts as a whitelist of acceptable values for the `topic` metadata column:

- If a single topic is provided, it is used as the default target topic to produce to
- If a list is provided, only topics in that list can be produced to
- If no list is provided, any value is accepted

Builds on the work in https://github.com/apache/flink/pull/16142 by @SteNicholas

## Brief change log

- Adds `topic` as writable metadata
- Updates the Table API Kafka sink to accept `List<String> topics` instead of `String topic`, mirroring the source side
- Implements the whitelist behaviour in `DynamicKafkaRecordSerializationSchema`

## Verifying this change

- [x] Tests that the sink factory and related machinery work as expected
- [x] Tests the various valid and invalid scenarios for `topic` metadata in `DynamicKafkaRecordSerializationSchema`

## Does this pull request potentially affect one of the following parts:

- Dependencies (does it add or upgrade a dependency): no
- The public API, i.e., is any changed class annotated with `@Public(Evolving)`: no
- The serializers: yes
- The runtime per-record code paths (performance sensitive): yes
- Anything that affects deployment or recovery: no
- The S3 file system connector: no

## Documentation

- Does this pull request introduce a new feature? yes
- If yes, how is the feature documented? updated documentation

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at: [email protected]
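To make the whitelist/default-topic rules above concrete, here is a minimal standalone sketch of the resolution logic. The class and method names (`TopicResolver`, `resolve`) are illustrative only and are not the actual connector API; the real logic lives inside `DynamicKafkaRecordSerializationSchema`.

```java
import java.util.List;

// Hypothetical sketch of the per-record topic resolution described above.
// `topics` mirrors the sink's configured List<String>; null or empty means
// "accept any topic", a single element doubles as the default target.
public class TopicResolver {
    private final List<String> topics;

    public TopicResolver(List<String> topics) {
        this.topics = topics;
    }

    /** Resolves the target topic given the record's `topic` metadata value
     *  (which may be null when the column was not set). */
    public String resolve(String metadataTopic) {
        if (metadataTopic == null) {
            if (topics != null && topics.size() == 1) {
                // Single configured topic acts as the default target.
                return topics.get(0);
            }
            throw new IllegalArgumentException(
                    "A `topic` metadata value is required when no single default topic is configured.");
        }
        if (topics != null && !topics.isEmpty() && !topics.contains(metadataTopic)) {
            // Whitelist behaviour: reject topics outside the configured list.
            throw new IllegalArgumentException(
                    "Topic '" + metadataTopic + "' is not in the configured topic list " + topics);
        }
        return metadataTopic;
    }
}
```

In SQL terms, a writer would populate the `topic` metadata column in the `INSERT`, and each row is routed (or rejected) according to these rules.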
