Hi Anil,

Here's a pointer to Flink's end-to-end test that checks the integration
with the Schema Registry [1].
It was recently updated, so I hope it works the same way in Flink 1.9.

Best,
Fabian

[1]
https://github.com/apache/flink/blob/master/flink-end-to-end-tests/flink-confluent-schema-registry/src/main/java/org/apache/flink/schema/registry/test/TestAvroConsumerConfluent.java
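For the `KafkaSerializationSchema` route asked about below, one possible sketch (this is an illustration, not the approach from the linked test): wrap Confluent's `KafkaAvroSerializer` (from the `kafka-avro-serializer` artifact) in a `KafkaSerializationSchema<GenericRecord>`. The serializer registers the schema with the registry on first use. The class name `ConfluentRegistrySerializationSchema` and the config map are my own; the registry URL key is Confluent's standard `schema.registry.url`.

```java
import java.util.Map;

import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.streaming.connectors.kafka.KafkaSerializationSchema;
import org.apache.kafka.clients.producer.ProducerRecord;

/**
 * A KafkaSerializationSchema that delegates to Confluent's
 * KafkaAvroSerializer, which talks to the Schema Registry.
 */
public class ConfluentRegistrySerializationSchema
        implements KafkaSerializationSchema<GenericRecord> {

    private final String topic;
    // Must contain "schema.registry.url"; a HashMap keeps the schema serializable.
    private final Map<String, Object> serializerConfig;
    private transient KafkaAvroSerializer serializer;

    public ConfluentRegistrySerializationSchema(
            String topic, Map<String, Object> serializerConfig) {
        this.topic = topic;
        this.serializerConfig = serializerConfig;
    }

    @Override
    public ProducerRecord<byte[], byte[]> serialize(
            GenericRecord element, Long timestamp) {
        if (serializer == null) {
            // Create the (non-serializable) Confluent serializer lazily,
            // i.e. on the task manager rather than on the client.
            serializer = new KafkaAvroSerializer();
            serializer.configure(serializerConfig, false); // false = value serializer
        }
        byte[] value = serializer.serialize(topic, element);
        return new ProducerRecord<>(topic, null, timestamp, null, value);
    }
}
```

It would then be passed to the `FlinkKafkaProducer` constructor that takes a `KafkaSerializationSchema`, e.g. `new FlinkKafkaProducer<>(topic, new ConfluentRegistrySerializationSchema(topic, serializerConfig), kafkaConfig, FlinkKafkaProducer.Semantic.AT_LEAST_ONCE)`.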

On Sat., 18 Apr. 2020 at 19:17, Anil K <sendto.ani...@gmail.com> wrote:

> Hi,
>
> What is the best way to use Confluent SchemaRegistry with
> FlinkKafkaProducer?
>
> What I have right now is as follows.
>
> SerializationSchema<GenericRecord> serializationSchema =
>     ConfluentRegistryAvroSerializationSchema.forGeneric(topic, schema,
> schemaRegistryUrl);
>
> FlinkKafkaProducer<GenericRecord> kafkaProducer =
>     new FlinkKafkaProducer<>(topic, serializationSchema, kafkaConfig);
> outputStream.addSink(kafkaProducer);
>
> The FlinkKafkaProducer constructor that takes a SerializationSchema is now
> deprecated.
>
> I am using Flink 1.9.
>
> How do I use FlinkKafkaProducer with a KafkaSerializationSchema and the
> Confluent Schema Registry?
>
> Is there some reference/documentation I could use?
>
> Thanks, Anil.
>
>