[
https://issues.apache.org/jira/browse/FLINK-8983?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16524974#comment-16524974
]
ASF GitHub Bot commented on FLINK-8983:
---------------------------------------
Github user tillrohrmann commented on the issue:
https://github.com/apache/flink/pull/6083
This is a good point @medcv. However, I think we should tackle adding an
`AvroSerializationConfluentSchema` as an orthogonal step. What about removing
it from this PR, which covers the existing integration with Confluent's schema
registry? Additionally, we should open a JIRA issue to add an
`AvroSerializationConfluentSchema` to Flink. Once this has been added, we can
adapt this end-to-end test.
What do you think @medcv?
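
For context, one way such a serialization schema could look is sketched below. This is only an illustration of the idea, not the eventual Flink API: the class name follows the one discussed above, and wrapping Confluent's `KafkaAvroSerializer` (which registers the schema and prefixes each payload with the schema id) is an assumption of this sketch.

```java
import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.api.common.serialization.SerializationSchema;

import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a registry-aware Avro serialization schema.
public class AvroSerializationConfluentSchema implements SerializationSchema<GenericRecord> {

    private final String registryUrl;  // assumed Confluent schema registry URL
    private final String topic;        // the subject name is derived from the topic

    private transient KafkaAvroSerializer serializer;

    public AvroSerializationConfluentSchema(String registryUrl, String topic) {
        this.registryUrl = registryUrl;
        this.topic = topic;
    }

    @Override
    public byte[] serialize(GenericRecord record) {
        if (serializer == null) {
            // Lazily create the Confluent serializer on the task manager,
            // since this schema is shipped to the cluster via Java serialization.
            serializer = new KafkaAvroSerializer();
            Map<String, Object> config = new HashMap<>();
            config.put("schema.registry.url", registryUrl);
            serializer.configure(config, false);
        }
        return serializer.serialize(topic, record);
    }
}
```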
> End-to-end test: Confluent schema registry
> ------------------------------------------
>
> Key: FLINK-8983
> URL: https://issues.apache.org/jira/browse/FLINK-8983
> Project: Flink
> Issue Type: Sub-task
> Components: Kafka Connector, Tests
> Reporter: Till Rohrmann
> Assignee: Yazdan Shirvany
> Priority: Critical
> Labels: pull-request-available
>
> It would be good to add an end-to-end test which verifies that Flink is able
> to work together with the Confluent schema registry. In order to do that, we
> have to set up a Kafka cluster and write a Flink job which consumes an Avro
> type from Kafka, resolving its schema via the Confluent schema registry.
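
As a rough illustration of the kind of job the test needs (not the test's actual code), the read side could look like the sketch below. The topic name, bootstrap servers, registry URL, consumer group, and the Avro schema are all placeholders; it assumes the existing `ConfluentRegistryAvroDeserializationSchema` from flink-avro-confluent-registry and the 0.11 Kafka connector.

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

import java.util.Properties;

public class ConfluentSchemaRegistryJob {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Hypothetical Avro schema for the test records.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"User\",\"fields\":"
                + "[{\"name\":\"name\",\"type\":\"string\"}]}");

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");   // assumed local Kafka
        props.setProperty("group.id", "schema-registry-e2e-test");  // hypothetical group id

        // Deserialize Avro records, looking up writer schemas in the Confluent registry.
        FlinkKafkaConsumer011<GenericRecord> consumer = new FlinkKafkaConsumer011<>(
            "test-avro-input",                                       // hypothetical topic name
            ConfluentRegistryAvroDeserializationSchema.forGeneric(
                schema, "http://localhost:8081"),                    // assumed registry URL
            props);

        env.addSource(consumer)
            .map(new MapFunction<GenericRecord, String>() {
                @Override
                public String map(GenericRecord record) {
                    return record.toString();  // JSON-like rendering of the record
                }
            })
            .print();

        env.execute("Confluent schema registry end-to-end test");
    }
}
```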
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)