Hi all,
I am trying to create a Flink SQL pipeline that consumes from a Kafka
topic containing plain Avro records (no schema registry involved).
As far as I can tell from the docs, for plain Avro the schema (in the
Flink SQL context) is derived from the table definition.
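For context, a stripped-down version of what I am doing looks roughly
like this (broker, topic and columns are placeholders; the real schema
is much larger and deeply nested):

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class PlainAvroTable {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // With 'format' = 'avro' (no registry), Flink derives the Avro
        // schema from the column types, so the DDL has to match the
        // producer's writer schema exactly.
        tEnv.executeSql(
                "CREATE TABLE events (\n"
              + "  id BIGINT,\n"
              + "  payload ROW<name STRING, score INT>\n"
              + ") WITH (\n"
              + "  'connector' = 'kafka',\n"
              + "  'topic' = 'my-topic',\n"
              + "  'properties.bootstrap.servers' = 'localhost:9092',\n"
              + "  'scan.startup.mode' = 'earliest-offset',\n"
              + "  'format' = 'avro'\n"
              + ")");
    }
}
```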

My problem is that the schema represents a very large, nested object,
and I am hitting a deserialization issue that I cannot make sense of.
The error I get is very generic:

```
Caused by: org.apache.avro.InvalidNumberEncodingException: Invalid int encoding
```

I manually created the table so that it should match the schema of the
topic, but I am stuck debugging where the issue comes from (or more
precisely, which field is causing it).
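
The best idea I have so far is to bypass Flink and decode a raw record
straight from the topic with the plain Avro reader, using the schema I
believe the producer writes, to find the first record that blows up. A
rough sketch (broker, topic and schema file are placeholders, and it
assumes the Kafka value is the raw Avro binary body with no extra
header bytes):

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DecoderFactory;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;

public class AvroTopicProbe {
    public static void main(String[] args) throws Exception {
        // The .avsc I hand-wrote to mirror the table definition.
        Schema schema = new Schema.Parser().parse(
                AvroTopicProbe.class.getResourceAsStream("/expected-schema.avsc"));

        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "avro-debug");
        props.put("auto.offset.reset", "earliest");
        props.put("key.deserializer", ByteArrayDeserializer.class.getName());
        props.put("value.deserializer", ByteArrayDeserializer.class.getName());

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic"));
            ConsumerRecords<byte[], byte[]> records =
                    consumer.poll(Duration.ofSeconds(10));
            GenericDatumReader<GenericRecord> reader =
                    new GenericDatumReader<>(schema);
            for (ConsumerRecord<byte[], byte[]> rec : records) {
                try {
                    GenericRecord value = reader.read(null,
                            DecoderFactory.get().binaryDecoder(rec.value(), null));
                    System.out.println("offset " + rec.offset() + " OK: " + value);
                } catch (Exception e) {
                    // The first failing offset tells me where to look,
                    // though not which field.
                    System.out.println("offset " + rec.offset() + " FAILED: " + e);
                }
            }
        }
    }
}
```

Even with that I would only see which record fails, not which field.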

I was wondering if anyone has encountered a similar use case and has
any tips on how to better debug this.

Thanks,
Yarden
