maver1ck commented on pull request #12919:
URL: https://github.com/apache/flink/pull/12919#issuecomment-715009216
@danny0405
The problem is that with this strategy I'm unable to read anything from Kafka using the Confluent Schema Registry. Example:
I have data in Kafka with the following value schema:
```
{
  "type": "record",
  "name": "myrecord",
  "fields": [
    {
      "name": "f1",
      "type": "string"
    }
  ]
}
```
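For context, this is roughly how such records can end up in the topic and the schema in the registry. This is only a sketch, assuming Confluent's `KafkaAvroSerializer` (which registers the value schema under the `test-value` subject); the topic name, bootstrap servers, and registry URL mirror the DDL below, and the sample value `"value1"` is made up:
```java
import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ProduceTestRecord {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // Confluent serializer: writes Avro and registers the schema in the Schema Registry.
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");

        // The value schema from above.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"myrecord\","
            + "\"fields\":[{\"name\":\"f1\",\"type\":\"string\"}]}");

        GenericRecord record = new GenericData.Record(schema);
        record.put("f1", "value1"); // sample payload, not from the original report

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("test", record));
            producer.flush();
        }
    }
}
```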
I'm creating a table using the avro-confluent format:
```
create table `test` (
  `f1` STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'test',
  'properties.bootstrap.servers' = 'localhost:9092',
  'properties.group.id' = 'test1234',
  'scan.startup.mode' = 'earliest-offset',
  'format' = 'avro-confluent',
  'avro-confluent.schema-registry.url' = 'http://localhost:8081'
);
```
When I try to select data, I get this error:
```
SELECT * FROM test;
[ERROR] Could not execute SQL statement. Reason:
org.apache.avro.AvroTypeException: Found myrecord, expecting record, missing
required field record_f1
```
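For what it's worth, this looks like plain Avro schema resolution failing: judging from the message, the reader schema derived on the Flink side is a record named `record` with a required field `record_f1`, which does not match the writer schema registered by the producer (`myrecord` with field `f1`). A minimal sketch using only the Avro `GenericDatumWriter`/`GenericDatumReader` API, with the reader schema reconstructed from the error message (not taken from Flink's actual converter), reproduces the same class of failure:
```java
import java.io.ByteArrayOutputStream;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public class SchemaMismatchRepro {
    public static void main(String[] args) throws Exception {
        // Writer schema: what the producer registered in the Schema Registry.
        Schema writerSchema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"myrecord\","
            + "\"fields\":[{\"name\":\"f1\",\"type\":\"string\"}]}");

        // Reader schema: reconstructed from the error message
        // ("expecting record, missing required field record_f1");
        // this is an assumption for illustration, not Flink's code.
        Schema readerSchema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"record\","
            + "\"fields\":[{\"name\":\"record_f1\",\"type\":\"string\"}]}");

        // Serialize one record with the writer schema.
        GenericRecord rec = new GenericData.Record(writerSchema);
        rec.put("f1", "hello");
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(writerSchema).write(rec, encoder);
        encoder.flush();

        // Deserialize with the mismatched reader schema: resolution fails with
        // an AvroTypeException because "record_f1" is required by the reader
        // schema but absent from the writer schema and has no default.
        GenericDatumReader<GenericRecord> reader =
            new GenericDatumReader<>(writerSchema, readerSchema);
        reader.read(null, DecoderFactory.get().binaryDecoder(out.toByteArray(), null));
    }
}
```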