Hi,
We read short Avro binary-encoded messages from Kafka. We don't send the 
schema with every message, and we don't use a schema registry at the moment. 
A client sent a malformed message to Kafka.
Our schema contains an array of records. Bytes in the malformed message were 
interpreted as a huge array block size, and the DatumReader tried to allocate 
a huge Object[].
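To illustrate what we think is happening, here is a minimal sketch in plain Java 
(no Avro dependency; decodeZigZagLong mimics the variable-length zigzag long 
decoding Avro's BinaryDecoder uses for array block counts) showing how a few 
garbage bytes can decode to an enormous count:

```java
// Sketch: Avro encodes array block counts as zigzag varint longs,
// so a handful of arbitrary bytes can decode to a very large count.
public class BlockCountDemo {
    // Decodes a variable-length zigzag long the way Avro's binary format defines it.
    static long decodeZigZagLong(byte[] buf) {
        long n = 0;
        int shift = 0;
        for (byte b : buf) {
            n |= (long) (b & 0x7F) << shift;
            shift += 7;
            if ((b & 0x80) == 0) break; // last byte has the high bit clear
        }
        return (n >>> 1) ^ -(n & 1); // undo zigzag encoding
    }

    public static void main(String[] args) {
        // Five garbage bytes interpreted as an array block count:
        byte[] malformed = {(byte) 0xFE, (byte) 0xFF, (byte) 0xFF, (byte) 0xFF, 0x0F};
        long count = decodeZigZagLong(malformed);
        System.out.println(count); // prints 2147483647
    }
}
```

So five stray bytes are enough to make the reader believe the array has 
Integer.MAX_VALUE elements.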

Is there any way to limit the size of an array in Avro, similar to the 
org.apache.avro.limits.string.maxLength property, or to limit the total size 
of a decoded object, or to apply some schema validation?
Would it be possible for the decoder to compare the remaining size of the 
byte buffer with the array block size?
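As a stop-gap, we are considering a sanity check along these lines (just a 
sketch, not using the real Avro decoder API; the names checkBlockCount and 
bytesRemaining are our own). Since every array element occupies at least one 
byte, a block count larger than the bytes left in the buffer must be corrupt:

```java
// Sketch of the check we have in mind: reject an array block count that
// cannot possibly fit in the remaining bytes (each element needs >= 1 byte).
public class BlockCountCheck {
    static void checkBlockCount(long blockCount, long bytesRemaining) {
        if (blockCount < 0 || blockCount > bytesRemaining) {
            throw new IllegalArgumentException(
                "Implausible array block count " + blockCount
                + " with only " + bytesRemaining + " bytes left");
        }
    }

    public static void main(String[] args) {
        checkBlockCount(10, 128); // plausible, passes
        try {
            checkBlockCount(2147483647L, 64); // garbage count, rejected
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```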

Thanks

Milan Konzal
