[ https://issues.apache.org/jira/browse/FLINK-14108?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17329488#comment-17329488 ]

Flink Jira Bot commented on FLINK-14108:
----------------------------------------

This issue is assigned but has not received an update in 7 days so it has been 
labeled "stale-assigned". If you are still working on the issue, please give an 
update and remove the label. If you are no longer working on the issue, please 
unassign so someone else may work on it. In 7 days the issue will be 
automatically unassigned.

> Support for Confluent Kafka schema registry for Avro serialisation 
> -------------------------------------------------------------------
>
>                 Key: FLINK-14108
>                 URL: https://issues.apache.org/jira/browse/FLINK-14108
>             Project: Flink
>          Issue Type: New Feature
>          Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile)
>    Affects Versions: 1.10.0
>            Reporter:  Lasse Nedergaard
>            Assignee:  Lasse Nedergaard
>            Priority: Minor
>              Labels: stale-assigned, stale-minor
>
> The current implementation in flink-avro-confluent-registry supports 
> deserialization with schema lookup in the Confluent Kafka schema registry. 
> I would like support for serialization as well, following the same structure 
> as deserialization. With this feature it would be possible to use the Confluent 
> schema registry in a sink writing Avro to Kafka and, at the same time, register 
> the schema used.
> The test in TestAvroConsumerConfluent needs to be updated together with its 
> comment, as the comment indicates it uses the Confluent schema registry for 
> writes, but the example code uses SimpleStringSchema.
> We have a running version that we would like to contribute back to the community.
>  
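For context, any serializer that registers schemas with the Confluent schema registry has to emit Confluent's wire format: one magic byte (0x00), a 4-byte big-endian schema id, then the Avro-encoded payload. The sketch below illustrates only that framing, using an opaque byte string in place of a real Avro-encoded record (an assumption for brevity); it is not the proposed Flink implementation.

```python
import struct

# Confluent wire format: 0x00 magic byte + 4-byte big-endian schema id + payload.
MAGIC_BYTE = 0


def frame(schema_id: int, avro_payload: bytes) -> bytes:
    """Prepend the Confluent wire-format header to an (assumed) Avro payload."""
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + avro_payload


def unframe(record: bytes) -> tuple[int, bytes]:
    """Split a framed record back into (schema_id, payload), checking the magic byte."""
    magic, schema_id = struct.unpack(">bI", record[:5])
    if magic != MAGIC_BYTE:
        raise ValueError("unknown magic byte: %d" % magic)
    return schema_id, record[5:]
```

A deserialization schema reads this header to look the schema up by id; a serialization schema would do the reverse, registering the schema first to obtain the id.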



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
