[
https://issues.apache.org/jira/browse/FLINK-18026?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17127854#comment-17127854
]
Shengkai Fang commented on FLINK-18026:
---------------------------------------
Currently we have three cases: KafkaAvro2Kafka, KafkaCsv2Kafka, and KafkaJson2Kafka.
The goal of KafkaAvro2Kafka is to test whether we can read from and write to
Kafka using the Avro format.
The goal of KafkaCsv2Kafka is to test whether we can read from and write to
Kafka using the CSV format.
The goal of KafkaJson2Kafka is to test whether we can read from and write to
Kafka using the JSON format; this case also adds a test for the ARRAY data type.
Every case has two versions: one using the old connector property keys and one
using the new connector property keys, as sketched below.
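For reference, a rough sketch of the kind of DDL the JSON case would run through the
SQL-CLI in both flavors of property keys. The table names, topic, broker address, and
schema here are placeholders for illustration, not the ones used in the actual test scripts:

    -- New property keys (Flink 1.11 factory style)
    CREATE TABLE json_source (
      id INT,
      tags ARRAY<STRING>
    ) WITH (
      'connector' = 'kafka',
      'topic' = 'json_source_topic',
      'properties.bootstrap.servers' = 'localhost:9092',
      'scan.startup.mode' = 'earliest-offset',
      'format' = 'json'
    );

    -- Old (legacy) property keys for the same table
    CREATE TABLE json_source_legacy (
      id INT,
      tags ARRAY<STRING>
    ) WITH (
      'connector.type' = 'kafka',
      'connector.version' = 'universal',
      'connector.topic' = 'json_source_topic',
      'connector.properties.bootstrap.servers' = 'localhost:9092',
      'connector.startup-mode' = 'earliest-offset',
      'format.type' = 'json',
      'format.derive-schema' = 'true'
    );

The sink table is declared the same way against a different topic, and the test body is
essentially INSERT INTO json_sink SELECT id, tags FROM json_source. The Avro and CSV
cases only swap the format-related keys.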
> E2E tests manually for Kafka connector and all formats
> ------------------------------------------------------
>
> Key: FLINK-18026
> URL: https://issues.apache.org/jira/browse/FLINK-18026
> Project: Flink
> Issue Type: Sub-task
> Components: Connectors / Kafka, Tests
> Affects Versions: 1.11.0
> Reporter: Danny Chen
> Assignee: Shengkai Fang
> Priority: Blocker
> Fix For: 1.11.0
>
>
> Use the SQL-CLI to:
> - test Kafka 2 Kafka using Avro
> - test Kafka 2 Kafka using JSON
> - test Kafka 2 Kafka using CSV