[ https://issues.apache.org/jira/browse/FLINK-8983?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16523088#comment-16523088 ]

ASF GitHub Bot commented on FLINK-8983:
---------------------------------------

Github user medcv commented on the issue:

    https://github.com/apache/flink/pull/6083
  
    @tillrohrmann I made the changes and used 
`ConfluentRegistryAvroDeserializationSchema` for deserialization.  
    I am still using `AvroSerializationConfluentSchema`, as we need to compare each 
incoming `Event` against its `Schema` before sending the data to Kafka and use the 
schema registry concept to have a full end-to-end test.
    We might also need to add `AvroSerializationConfluentSchema` to the Flink dist. 
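
    For reference, a minimal sketch (not the code from this PR) of how
`ConfluentRegistryAvroDeserializationSchema` can be wired into a Kafka consumer for
such a test; the topic name, Avro schema, registry URL, and Kafka properties below
are placeholders:

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

import java.util.Properties;

public class SchemaRegistryReadSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Avro schema of the records stored in the topic (placeholder schema).
        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Event\","
                        + "\"fields\":[{\"name\":\"id\",\"type\":\"string\"}]}");

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "schema-registry-e2e");

        // Deserializer that fetches the writer schema from the Confluent schema registry.
        FlinkKafkaConsumer011<GenericRecord> consumer = new FlinkKafkaConsumer011<>(
                "test-avro-topic",
                ConfluentRegistryAvroDeserializationSchema.forGeneric(schema, "http://localhost:8081"),
                props);

        env.addSource(consumer).print();
        env.execute("Confluent schema registry read sketch");
    }
}
```

    When a generated `SpecificRecord` class is available,
`ConfluentRegistryAvroDeserializationSchema.forSpecific(...)` can be used instead of
`forGeneric(...)`.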


> End-to-end test: Confluent schema registry
> ------------------------------------------
>
>                 Key: FLINK-8983
>                 URL: https://issues.apache.org/jira/browse/FLINK-8983
>             Project: Flink
>          Issue Type: Sub-task
>          Components: Kafka Connector, Tests
>            Reporter: Till Rohrmann
>            Assignee: Yazdan Shirvany
>            Priority: Critical
>              Labels: pull-request-available
>
> It would be good to add an end-to-end test which verifies that Flink is able 
> to work together with the Confluent schema registry. In order to do that we 
> have to set up a Kafka cluster and write a Flink job which reads an Avro type 
> from Kafka using the Confluent schema registry.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
