It's rather hard to help if we don't know the format in which the
records are serialized. There is a significant difference depending on
whether you use a schema registry or not. All schema registries known
to me prepend the actual data with some kind of magic byte and an
identifier of the schema. Therefore, if we do not know to expect that,
we cannot properly deserialize the record.
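For illustration, here is a minimal sketch of how you could check for
such a header, assuming the Confluent wire format (magic byte 0x00
followed by a big-endian 4-byte schema id; other registries use
similar but not identical framing):

import java.nio.ByteBuffer;

public class WireFormatCheck {

    // Inspect the first bytes of a raw Kafka record value. Under the
    // Confluent wire format, registry-serialized records start with the
    // magic byte 0x00 followed by a 4-byte schema id; the Avro payload
    // only begins at offset 5. A plain Avro deserializer that is not
    // aware of this header will misread those 5 bytes as Avro data.
    public static void inspect(byte[] record) {
        if (record != null && record.length >= 5 && record[0] == 0x00) {
            int schemaId = ByteBuffer.wrap(record, 1, 4).getInt();
            System.out.println("Possible registry framing, schema id = " + schemaId);
        } else {
            System.out.println("No magic byte; likely plain Avro binary");
        }
    }
}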

Nevertheless, I would not say the problem has anything to do with a
schema registry. If I understand you correctly, some records can be
deserialized. If they had all been produced with the schema-registry
style of serialization, all of them would fail.

What I can recommend is to try to log/identify a record that cannot be
deserialized and debug the AvroRowDeserializationSchema with it.
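In case it helps, below is a rough sketch of such a debugging harness
(the bootstrap servers, group id, topic name, and schema string are
placeholders you would have to fill in): consume the raw bytes with a
plain Kafka consumer and feed each value into
AvroRowDeserializationSchema outside of the job, dumping the records
that fail:

import org.apache.flink.formats.avro.AvroRowDeserializationSchema;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class DeserializationDebugger {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");      // placeholder
        props.put("group.id", "avro-debug");                   // placeholder
        props.put("auto.offset.reset", "earliest");
        props.put("key.deserializer", ByteArrayDeserializer.class.getName());
        props.put("value.deserializer", ByteArrayDeserializer.class.getName());

        // Use the same Avro schema string your table source is configured with.
        AvroRowDeserializationSchema schema =
                new AvroRowDeserializationSchema("...your Avro schema json...");

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("your-topic")); // placeholder
            ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<byte[], byte[]> rec : records) {
                try {
                    schema.deserialize(rec.value());
                } catch (Exception e) {
                    System.err.printf("offset %d failed: %s%n", rec.offset(), e);
                    // Hex-dump the first bytes to see whether a registry
                    // header (magic byte + schema id) is present.
                    byte[] v = rec.value();
                    for (int i = 0; i < Math.min(16, v.length); i++) {
                        System.err.printf("%02x ", v[i]);
                    }
                    System.err.println();
                }
            }
        }
    }
}

Once you have the offending bytes, comparing them against a record that
does deserialize usually makes the framing difference obvious.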

Best,

Dawid

On 06/06/2020 16:27, Ramana Uppala wrote:
> We are using AvroRowDeserializationSchema with a Kafka Table source to 
> deserialize the messages. The application failed with "Failed to deserialize 
> Avro record." for different messages, it seems.
>
> Caused by: org.apache.avro.AvroRuntimeException: Malformed data. Length is 
> negative: -26
>
> Caused by: java.lang.ArrayIndexOutOfBoundsException: -49
>       at 
> org.apache.avro.io.parsing.Symbol$Alternative.getSymbol(Symbol.java:424) 
> ~[avro-1.8.2.jar:1.8.2]
>
> We are not sure at this time what serialization mechanism the producer is 
> using to publish the messages. But are the above errors related to 
> https://issues.apache.org/jira/browse/FLINK-16048 ?
>
> Any suggestions on fixing the above issues? We are using Flink 1.10.
