[ https://issues.apache.org/jira/browse/FLINK-18800?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17298034#comment-17298034 ]

Sören Henning commented on FLINK-18800:
---------------------------------------

I'm having the same issue, but I'm not sure if it's directly related to 
Avro/Confluent Schema Registry. Is there really no direct way to serialize a 
{{Tuple2<K, V>}} by specifying both a {{KafkaSerializationSchema<K>}} and a 
{{KafkaSerializationSchema<V>}}?
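
As a workaround, the pattern asked about above can be sketched as a wrapper that delegates the key and value fields of a pair to two independent serializers. The class and method names below are hypothetical, and the interfaces are simplified stand-ins so the sketch is self-contained; in an actual job you would implement Flink's {{KafkaSerializationSchema<Tuple2<K, V>>}} (whose {{serialize}} method returns a {{ProducerRecord<byte[], byte[]>}}) on top of two {{SerializationSchema}} instances in the same way:

{code:java}
import java.nio.charset.StandardCharsets;
import java.util.AbstractMap;
import java.util.Map;

public class KeyValueSerializationSketch {

    // Simplified stand-in for Flink's
    // org.apache.flink.api.common.serialization.SerializationSchema<T>
    interface SerializationSchema<T> {
        byte[] serialize(T element);
    }

    // Simplified stand-in for the (key bytes, value bytes) part of a
    // Kafka ProducerRecord<byte[], byte[]>
    record KafkaRecord(byte[] key, byte[] value) {}

    // Core idea: serialize the first field with the key schema and the
    // second field with the value schema, then emit both into one record.
    static <K, V> KafkaRecord serialize(Map.Entry<K, V> element,
                                        SerializationSchema<K> keySchema,
                                        SerializationSchema<V> valueSchema) {
        return new KafkaRecord(keySchema.serialize(element.getKey()),
                               valueSchema.serialize(element.getValue()));
    }

    public static void main(String[] args) {
        SerializationSchema<String> keySchema =
                s -> s.getBytes(StandardCharsets.UTF_8);
        SerializationSchema<Integer> valueSchema =
                i -> Integer.toString(i).getBytes(StandardCharsets.UTF_8);

        KafkaRecord rec = serialize(
                new AbstractMap.SimpleEntry<>("user-1", 42),
                keySchema, valueSchema);

        // prints user-1:42
        System.out.println(new String(rec.key(), StandardCharsets.UTF_8)
                + ":" + new String(rec.value(), StandardCharsets.UTF_8));
    }
}
{code}

In a real implementation the two inner schemas could be {{AvroSerializationSchema}} or {{ConfluentRegistryAvroSerializationSchema}} instances, which is what the issue below is asking Flink to support out of the box.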

> Avro serialization schema doesn't support Kafka key/value serialization
> -----------------------------------------------------------------------
>
>                 Key: FLINK-18800
>                 URL: https://issues.apache.org/jira/browse/FLINK-18800
>             Project: Flink
>          Issue Type: Task
>          Components: Connectors / Kafka, Formats (JSON, Avro, Parquet, ORC, 
> SequenceFile)
>    Affects Versions: 1.11.0, 1.11.1
>            Reporter: Mohammad Hossein Gerami
>            Priority: Critical
>
> {color:#ff8b00}AvroSerializationSchema{color} and 
> {color:#ff8b00}ConfluentRegistryAvroSerializationSchema{color} don't 
> support Kafka key/value serialization. I implemented a custom Avro 
> serialization schema to solve this problem. 
> Please agree on implementing a new class that supports Kafka key/value 
> serialization.
> For example, Flink could provide a class like this:
> {code:java}
> public class KafkaAvroRegistrySchemaSerializationSchema extends 
> RegistryAvroSerializationSchema<GenericRecord> implements 
> KafkaSerializationSchema<GenericRecord>{code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
