Re: Utilizing Kafka headers in Flink Kafka connector

2022-10-13 Thread Shengkai Fang
Hi. If you are using Flink SQL, you can use the SQL API to parse or write the headers of a Kafka record [1].

Best,
Shengkai

[1] https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/table/kafka/#available-metadata
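[Editor's note] A minimal sketch of what the SQL-API approach could look like, not taken from the thread: the Kafka SQL connector exposes record headers as a `headers` metadata column of type MAP<STRING, BYTES> per [1]. The topic name `events`, the field names, the group id, and the `source` header key are illustrative assumptions.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class KafkaHeadersSqlSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Declare the Kafka connector's 'headers' metadata column alongside the
        // physical columns. Connector options below are placeholder values.
        tEnv.executeSql(
                "CREATE TABLE events (\n"
                        + "  id STRING,\n"
                        + "  payload STRING,\n"
                        + "  `headers` MAP<STRING, BYTES> METADATA\n"
                        + ") WITH (\n"
                        + "  'connector' = 'kafka',\n"
                        + "  'topic' = 'events',\n"
                        + "  'properties.bootstrap.servers' = 'localhost:9092',\n"
                        + "  'properties.group.id' = 'headers-example',\n"
                        + "  'scan.startup.mode' = 'earliest-offset',\n"
                        + "  'format' = 'json'\n"
                        + ")");

        // Individual header values (BYTES) can be looked up by key,
        // here a hypothetical 'source' header.
        tEnv.executeSql(
                "SELECT id, `headers`['source'] AS source_bytes FROM events")
            .print();
    }
}
```

Since the `headers` metadata column is writable as well as readable [1], the same column can be listed in a sink table's schema and populated in an INSERT statement to write headers.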

Re: Utilizing Kafka headers in Flink Kafka connector

2022-10-12 Thread Yaroslav Tkachenko
Hi,

You can implement a custom KafkaRecordDeserializationSchema (example: https://docs.immerok.cloud/docs/cookbook/reading-apache-kafka-headers-with-apache-flink/#the-custom-deserializer) and just avoid emitting the record if the header value matches what you need.
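[Editor's note] A minimal sketch of such a filtering deserializer, not taken from the linked cookbook: the header name "skip", the expected value "true", and the plain-string value decoding are made-up assumptions for illustration.

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.connector.kafka.source.reader.deserializer.KafkaRecordDeserializationSchema;
import org.apache.flink.util.Collector;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.header.Header;

/**
 * Drops records whose hypothetical "skip" header equals "true";
 * all other records are emitted as plain strings.
 */
public class HeaderFilteringDeserializer
        implements KafkaRecordDeserializationSchema<String> {

    private final SimpleStringSchema valueSchema = new SimpleStringSchema();

    @Override
    public void deserialize(ConsumerRecord<byte[], byte[]> record, Collector<String> out)
            throws IOException {
        Header header = record.headers().lastHeader("skip");
        if (header != null
                && "true".equals(new String(header.value(), StandardCharsets.UTF_8))) {
            // Matching header: emit nothing, effectively filtering the record out.
            return;
        }
        out.collect(valueSchema.deserialize(record.value()));
    }

    @Override
    public TypeInformation<String> getProducedType() {
        return Types.STRING;
    }
}
```

The deserializer can then be wired into the source with KafkaSourceBuilder#setDeserializer, e.g. `KafkaSource.<String>builder().setDeserializer(new HeaderFilteringDeserializer())`.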