[
https://issues.apache.org/jira/browse/FLINK-19779?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17219633#comment-17219633
]
Maciej Bryński edited comment on FLINK-19779 at 10/23/20, 10:56 AM:
--------------------------------------------------------------------
No.
I've written the data to Kafka manually.
I wasn't able to read it without this patch because Flink was expecting a
record_ prefix for all fields. With this patch I'm able to read the data.
I'm using the following command line to write data to Kafka:
{code:bash}
kafka-avro-console-producer --broker-list localhost:9092 --topic test \
  --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}' \
  --property schema.registry.url=http://localhost:8081
{code}
Then I'm sending the following record:
{code:java}
{"f1":"xyz"}{code}
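For context, kafka-avro-console-producer does not write plain Avro: it registers the value schema and frames each record with the Confluent wire format (magic byte 0x00 followed by a 4-byte big-endian schema ID), which is what the avro-confluent format has to strip off before decoding. A minimal stdlib sketch of that framing (the schema ID 1 and the hand-encoded Avro body are hypothetical):

```python
import struct

def encode_confluent_header(schema_id: int) -> bytes:
    """Confluent wire format header: magic byte 0x00 + 4-byte big-endian schema ID."""
    return struct.pack(">bI", 0, schema_id)

def decode_confluent_header(payload: bytes):
    """Split a Confluent-framed Kafka value into (schema_id, avro_bytes)."""
    magic, schema_id = struct.unpack(">bI", payload[:5])
    if magic != 0:
        raise ValueError("not Confluent wire format")
    return schema_id, payload[5:]

# Hypothetical schema ID 1; the Avro-encoded body follows the 5-byte header.
# Avro string "xyz": zig-zag varint length 3 -> 0x06, then the raw bytes.
framed = encode_confluent_header(1) + b"\x06xyz"
schema_id, body = decode_confluent_header(framed)
```

This is only a sketch of the framing, not of Flink's deserializer.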
was (Author: maver1ck):
No.
I've written the data to Kafka manually.
I wasn't able to read it without this patch because Flink was expecting a
record_ prefix for all fields. With this patch I'm able to read the data.
I'm using the following command line to write data to Kafka:
{code:bash}
kafka-avro-console-producer --broker-list localhost:9092 --topic test \
  --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}' \
  --property parse.key=true --property "key.separator=|" \
  --property key.schema='{"type":"record","name":"mykey","fields":[{"name":"key","type":"int"}]}' \
  --property schema.registry.url=http://localhost:8081
{code}
Then I'm sending the following record:
{code:java}
{"key":123}|{"f1":"xyz"}{code}
> Remove the "record_" field name prefix for Confluent Avro format
> deserialization
> --------------------------------------------------------------------------------
>
> Key: FLINK-19779
> URL: https://issues.apache.org/jira/browse/FLINK-19779
> Project: Flink
> Issue Type: Bug
> Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile)
> Affects Versions: 1.12.0
> Reporter: Danny Chen
> Priority: Major
> Labels: pull-request-available
> Fix For: 1.12.0
>
>
> Reported by Maciej Bryński:
> The problem is this is not compatible. I'm unable to read anything from Kafka
> using the Confluent Registry. Example:
> I have data in Kafka with the following value schema:
> {code:java}
> {
>   "type": "record",
>   "name": "myrecord",
>   "fields": [
>     {
>       "name": "f1",
>       "type": "string"
>     }
>   ]
> }
> {code}
> I'm creating a table using the avro-confluent format:
> {code:sql}
> create table `test` (
>   `f1` STRING
> ) WITH (
>   'connector' = 'kafka',
>   'topic' = 'test',
>   'properties.bootstrap.servers' = 'localhost:9092',
>   'properties.group.id' = 'test1234',
>   'scan.startup.mode' = 'earliest-offset',
>   'format' = 'avro-confluent',
>   'avro-confluent.schema-registry.url' = 'http://localhost:8081'
> );
> {code}
> When trying to select data I'm getting an error:
> {code:noformat}
> SELECT * FROM test;
> [ERROR] Could not execute SQL statement. Reason:
> org.apache.avro.AvroTypeException: Found myrecord, expecting record, missing
> required field record_f1
> {code}
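The error above can be sketched as plain Avro schema resolution: before the fix, Flink derived a reader schema whose field names were all prefixed with record_, and Avro matches fields by name, so record_f1 has no writer-side counterpart and no default. A minimal sketch of the mismatch (not Flink's actual code; the reader schema below is an assumption based on the error message):

```python
import json

# Writer schema as registered in the Schema Registry (from the report above).
writer = json.loads(
    '{"type":"record","name":"myrecord",'
    '"fields":[{"name":"f1","type":"string"}]}')

# Reader schema as Flink derived it before the fix: every column name
# prefixed with "record_" (hypothetical reconstruction from the error).
reader = json.loads(
    '{"type":"record","name":"record",'
    '"fields":[{"name":"record_f1","type":"string"}]}')

writer_fields = {f["name"] for f in writer["fields"]}
reader_fields = {f["name"] for f in reader["fields"]}

# Avro resolves fields by name; a reader field with no writer counterpart
# and no default is exactly the "missing required field" failure.
missing = reader_fields - writer_fields
```

With the prefix removed, the reader field is f1 and resolution succeeds.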
--
This message was sent by Atlassian Jira
(v8.3.4#803005)