[ https://issues.apache.org/jira/browse/FLINK-19779?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17219601#comment-17219601 ]

Jark Wu edited comment on FLINK-19779 at 10/23/20, 10:03 AM:
-------------------------------------------------------------

Thanks [~maver1ck] for the link and for reporting this problem. Let's keep the 
discussion in this JIRA issue, because notifications on the pull request are 
easily missed. 

Regarding the problem and the fix [~danny0405] provided, I'm still confused 
about the root cause of {{org.apache.avro.AvroTypeException: Found myrecord, 
expecting record, missing required field record_f1}}. 

I guess you wrote Avro data into Kafka with Flink SQL some days ago (maybe with 
Flink 1.11.x?), then upgraded Flink to the latest master version, and the 
exception was thrown when the new Flink SQL consumed that Avro data from Kafka. 
Is that the "compatibility" problem you mentioned? The two Avro schemas are no 
longer compatible. 
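
To illustrate what I mean by incompatible: plain Avro can check the two schemas 
against each other, without Flink involved. A minimal sketch (the class name is 
made up for this example): the writer schema is the one in the registry 
({{myrecord}} with field {{f1}}), and the reader schema mimics the one derived 
from the DDL ({{record}} with the prefixed field {{record_f1}}):

{code:java}
import org.apache.avro.Schema;
import org.apache.avro.SchemaCompatibility;

public class CompatCheck {
    public static void main(String[] args) {
        // Writer schema: as registered in the Confluent Schema Registry.
        Schema writer = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"myrecord\",\"fields\":"
                + "[{\"name\":\"f1\",\"type\":\"string\"}]}");
        // Reader schema: derived from the DDL, with the "record_" prefix.
        Schema reader = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"record\",\"fields\":"
                + "[{\"name\":\"record_f1\",\"type\":\"string\"}]}");
        // The record names differ, and the reader requires a field the
        // writer never wrote, so this prints INCOMPATIBLE.
        System.out.println(SchemaCompatibility
                .checkReaderWriterCompatibility(reader, writer).getType());
    }
}
{code}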



> Remove the "record_" field name prefix for Confluent Avro format 
> deserialization
> --------------------------------------------------------------------------------
>
>                 Key: FLINK-19779
>                 URL: https://issues.apache.org/jira/browse/FLINK-19779
>             Project: Flink
>          Issue Type: Bug
>          Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile)
>    Affects Versions: 1.12.0
>            Reporter: Danny Chen
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 1.12.0
>
>
> Reported by Maciej BryƄski:
> The problem is that this is not compatible. I'm unable to read anything from 
> Kafka using the Confluent Registry. Example:
> I have data in Kafka with following value schema:
> {code:java}
> {
>   "type": "record",
>   "name": "myrecord",
>   "fields": [
>     {
>       "name": "f1",
>       "type": "string"
>     }
>   ]
> }
> {code}
> I'm creating the table using the avro-confluent format:
> {code:sql}
> create table `test` (
>   `f1` STRING
> ) WITH (
>   'connector' = 'kafka',
>   'topic' = 'test',
>   'properties.bootstrap.servers' = 'localhost:9092',
>   'properties.group.id' = 'test1234',
>   'scan.startup.mode' = 'earliest-offset',
>   'format' = 'avro-confluent',
>   'avro-confluent.schema-registry.url' = 'http://localhost:8081'
> );
> {code}
> When trying to select data, I'm getting this error:
> {code:noformat}
> SELECT * FROM test;
> [ERROR] Could not execute SQL statement. Reason:
> org.apache.avro.AvroTypeException: Found myrecord, expecting record, missing 
> required field record_f1
> {code}
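
The exception above can be reproduced outside Flink with plain Avro schema 
resolution. A minimal sketch (the class name is invented for the example): 
serialize one record with the registry schema, then deserialize it with the 
prefixed reader schema, which throws the same {{AvroTypeException}}:

{code:java}
import java.io.ByteArrayOutputStream;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public class PrefixRepro {
    public static void main(String[] args) throws Exception {
        // Writer schema: what the producer registered ("myrecord" / "f1").
        Schema writer = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"myrecord\",\"fields\":"
                + "[{\"name\":\"f1\",\"type\":\"string\"}]}");
        // Reader schema: what the consumer expects ("record" / "record_f1").
        Schema reader = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"record\",\"fields\":"
                + "[{\"name\":\"record_f1\",\"type\":\"string\"}]}");

        // Serialize one record with the writer schema.
        GenericRecord rec = new GenericData.Record(writer);
        rec.put("f1", "hello");
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder enc = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(writer).write(rec, enc);
        enc.flush();

        // Deserialize with the mismatched reader schema: schema resolution
        // fails with "Found myrecord, expecting record, missing required
        // field record_f1".
        BinaryDecoder dec =
            DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
        new GenericDatumReader<GenericRecord>(writer, reader).read(null, dec);
    }
}
{code}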


