[ https://issues.apache.org/jira/browse/FLINK-8983?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16491735#comment-16491735 ]
ASF GitHub Bot commented on FLINK-8983:
---------------------------------------
GitHub user medcv opened a pull request:
https://github.com/apache/flink/pull/6083
[FLINK-8983] End-to-end test: Confluent schema registry
## Brief change log
Added an end-to-end test which verifies that Flink is able to work together
with the Confluent schema registry. To do so, the test sets up a Kafka cluster
and runs a Flink job that writes and reads Avro records whose schemas are
resolved through the Confluent schema registry. A sketch of the reading side
of such a job is shown below.
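As a rough illustration of what the reading side of such a job can look like, here is a minimal sketch assuming the `flink-avro-confluent-registry` module's `ConfluentRegistryAvroDeserializationSchema` and the `FlinkKafkaConsumer011` connector; the topic name, registry URL, and generated `User` Avro class are illustrative placeholders, not taken from this PR:

```java
import java.util.Properties;

import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

public class SchemaRegistryReadJob {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties kafkaProps = new Properties();
        kafkaProps.setProperty("bootstrap.servers", "localhost:9092");
        kafkaProps.setProperty("group.id", "schema-registry-e2e");

        // Deserialize Avro records written by a Confluent-style producer: the writer
        // schema is fetched from the schema registry, while the reader schema comes
        // from the generated "User" SpecificRecord class (a placeholder in this sketch).
        FlinkKafkaConsumer011<User> consumer = new FlinkKafkaConsumer011<>(
                "test-avro-input",
                ConfluentRegistryAvroDeserializationSchema.forSpecific(
                        User.class, "http://localhost:8081"),
                kafkaProps);

        env.addSource(consumer)
           .map(user -> "Read user: " + user.getName())
           .print();

        env.execute("Confluent schema registry read job");
    }
}
```

The actual end-to-end test additionally drives a producer and verifies the output, but the snippet above shows the central piece: the deserialization schema that talks to the registry.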
## Does this pull request potentially affect one of the following parts:
- Dependencies (does it add or upgrade a dependency): (no)
- The public API, i.e., is any changed class annotated with
`@Public(Evolving)`: (no)
- The serializers: (no)
- The runtime per-record code paths (performance sensitive): (no)
- Anything that affects deployment or recovery: JobManager (and its
components), Checkpointing, Yarn/Mesos, ZooKeeper: (no)
- The S3 file system connector: (no)
## Documentation
- Does this pull request introduce a new feature? (no)
- If yes, how is the feature documented? (not applicable / docs /
JavaDocs / not documented)
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/medcv/flink FLINK-8983
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/flink/pull/6083.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #6083
----
commit 8bcaee1a6d8b32e10888e46e608a1478b4a66e9b
Author: Yadan.JS <y_shirvany@...>
Date: 2018-05-21T02:31:26Z
[FLINK-8983] End-to-end test: Confluent schema registry
----
> End-to-end test: Confluent schema registry
> ------------------------------------------
>
> Key: FLINK-8983
> URL: https://issues.apache.org/jira/browse/FLINK-8983
> Project: Flink
> Issue Type: Sub-task
> Components: Kafka Connector, Tests
> Reporter: Till Rohrmann
> Assignee: Yazdan Shirvany
> Priority: Critical
>
> It would be good to add an end-to-end test which verifies that Flink is able
> to work together with the Confluent schema registry. In order to do that, we
> have to set up a Kafka cluster and write a Flink job which reads from the
> Confluent schema registry, producing an Avro type.
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)