Danny Chen created FLINK-19779:
----------------------------------

             Summary: Remove the "record_" field name prefix for Confluent Avro format deserialization
                 Key: FLINK-19779
                 URL: https://issues.apache.org/jira/browse/FLINK-19779
             Project: Flink
          Issue Type: Bug
          Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile)
    Affects Versions: 1.12.0
            Reporter: Danny Chen
             Fix For: 1.12.0


Reported by Maciej Bryński:

The problem is that this is not compatible: I'm unable to read anything from Kafka using the Confluent Schema Registry. Example:
I have data in Kafka with following value schema:


{code:java}
{
  "type": "record",
  "name": "myrecord",
  "fields": [
    {
      "name": "f1",
      "type": "string"
    }
  ]
}
{code}
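
A record conforming to this schema, shown in Avro JSON encoding, would look like the following (the value is only a hypothetical example; the actual contents of the topic are not part of this report):

{code:java}
{
  "f1": "some value"
}
{code}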

I'm creating a table using the avro-confluent format:


{code:sql}
create table `test` (
  `f1` STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'test',
  'properties.bootstrap.servers' = 'localhost:9092',
  'properties.group.id' = 'test1234',
  'scan.startup.mode' = 'earliest-offset',
  'format' = 'avro-confluent',
  'avro-confluent.schema-registry.url' = 'http://localhost:8081'
);
{code}

When trying to select data, I'm getting this error:


{code:noformat}
SELECT * FROM test;
[ERROR] Could not execute SQL statement. Reason:
org.apache.avro.AvroTypeException: Found myrecord, expecting record, missing required field record_f1
{code}
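
The reason appears to be the reader schema that the format derives from the table columns: Avro resolves fields by name, and the derived schema prefixes every field name with "record_", so "record_f1" never matches "f1" in the writer schema and has no default to fall back on. Judging from the error message, the derived reader schema looks roughly like this (only the "record" and "record_f1" names are taken from the error; the nullability and other details are assumptions):

{code:java}
{
  "type": "record",
  "name": "record",
  "fields": [
    {
      "name": "record_f1",
      "type": "string"
    }
  ]
}
{code}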




--
This message was sent by Atlassian Jira
(v8.3.4#803005)
