[
https://issues.apache.org/jira/browse/FLINK-22748?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17353035#comment-17353035
]
Timo Walther commented on FLINK-22748:
--------------------------------------
Jark is right. I'm referring to Flink's concept of writable metadata columns.
They allow writing out additional per-record information that the connector can
consume. I just saw that the option {{topic}} is already optional for sources.
So we can also make it optional for sinks and check whether the resolved schema
of the catalog table contains this metadata column. Otherwise we throw an
exception.
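A rough sketch of what such a table definition could look like once the target
topic is exposed as a writable metadata column. This is only an illustration of
the proposal: the metadata key {{'topic'}}, the column name, and the omission of
the {{topic}} connector option are assumptions, not a finalized design.

```sql
-- Hypothetical DDL: the target topic as a persisted (writable) metadata column.
-- The static 'topic' connector option is omitted; the sink would route each
-- record to the topic value stored in the metadata column instead.
CREATE TABLE kafka_sink (
  user_id BIGINT,
  payload STRING,
  -- assumed metadata key; the actual key is part of this issue's design
  target_topic STRING METADATA FROM 'topic'
) WITH (
  'connector' = 'kafka',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'json'
);

-- Each row chooses its own destination topic:
INSERT INTO kafka_sink
SELECT user_id, payload, CONCAT('events-', region) AS target_topic
FROM source_table;
```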
> Allow dynamic target topic selection in SQL Kafka sinks
> -------------------------------------------------------
>
> Key: FLINK-22748
> URL: https://issues.apache.org/jira/browse/FLINK-22748
> Project: Flink
> Issue Type: Improvement
> Components: Connectors / Kafka, Table SQL / Ecosystem
> Reporter: Timo Walther
> Priority: Major
> Labels: starter
>
> We should allow writing to different Kafka topics based on some column value
> in SQL.
> The existing implementation can easily be adapted for that. In SQL terms, the
> "target topic" would be an additional persisted metadata column. All one
> needs to do is adapt:
> {{DynamicKafkaSerializationSchema}}
> {{KafkaDynamicSink}}
> We should guard this dynamic behavior via a config option and make the
> {{topic}} option optional in this case.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)