[ https://issues.apache.org/jira/browse/AVRO-3536?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Prathamesh updated AVRO-3536:
-----------------------------
Description:
The attached schema has a top-level Avro union type. Whenever the client tries to
deserialize a message, it fails while parsing the field opt_amount, with the
following stack trace:
{code:java}
Caused by: java.lang.ClassCastException: class java.nio.HeapByteBuffer cannot be cast to class java.math.BigDecimal (java.nio.HeapByteBuffer and java.math.BigDecimal are in module java.base of loader 'bootstrap')
at io.confluent.base.model.Test1.put(Test1.java:115)
at org.apache.avro.generic.GenericData.setField(GenericData.java:837)
at org.apache.avro.specific.SpecificDatumReader.readField(SpecificDatumReader.java:139)
at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:248)
at org.apache.avro.specific.SpecificDatumReader.readRecord(SpecificDatumReader.java:123)
at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:180)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:161)
at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:188)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:161)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:154)
at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer$DeserializationContext.read(AbstractKafkaAvroDeserializer.java:400)
{code}
A sample message you can try:
{code:java}
{ "io.confluent.base.model.Test1": { "opt_amount": { "bytes": "10.2" } } }
{code}
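For reference, a minimal stand-alone sketch that should hit the same code path. It is built on assumptions, not on the attachment itself: it assumes Test1 is the SpecificRecord class generated from the attached Test1.avsc, that opt_amount (bytes with a decimal logical type) is its only field, and that the writer schema is the top-level union of null and Test1:
{code:java}
import java.io.ByteArrayOutputStream;
import java.math.BigDecimal;
import java.nio.ByteBuffer;
import java.util.Arrays;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.generic.GenericRecordBuilder;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.specific.SpecificDatumReader;

import io.confluent.base.model.Test1; // assumed to be the class generated from the attached Test1.avsc

public class Avro3536Repro {
  public static void main(String[] args) throws Exception {
    // Top-level union ["null", Test1], mirroring the assumed layout of the attached schema.
    Schema test1Schema = Test1.getClassSchema();
    Schema unionSchema = Schema.createUnion(
        Arrays.asList(Schema.create(Schema.Type.NULL), test1Schema));

    // Encode a Test1 datum generically: opt_amount carries raw bytes for the decimal
    // payload (here the unscaled value of 10.2, assuming scale 1); the exact byte
    // content does not matter for reproducing the cast failure.
    GenericRecord datum = new GenericRecordBuilder(test1Schema)
        .set("opt_amount", ByteBuffer.wrap(new BigDecimal("10.2").unscaledValue().toByteArray()))
        .build();
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
    new GenericDatumWriter<GenericRecord>(unionSchema).write(datum, encoder);
    encoder.flush();

    // Decode through the same top-level union with a SpecificDatumReader. Per the
    // report, the decimal conversion is not applied when the record is reached
    // through the union, so Test1.put receives a ByteBuffer instead of a BigDecimal
    // and fails with the ClassCastException shown in the stack trace above.
    BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
    Object roundTripped = new SpecificDatumReader<>(unionSchema, unionSchema).read(null, decoder);
    System.out.println(roundTripped);
  }
}
{code}
The expectation is that the decimal conversion carried by the generated record class is also applied when the record is reached through the top-level union, i.e. that opt_amount arrives in put() as a BigDecimal.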
was:
The attached schema has top-level AVRO Union type. Whenever the client tries to
deserialize the message it fails at parsing field - , with stack trace -
Caused by: java.lang.ClassCastException: class java.nio.HeapByteBuffer cannot be cast to class java.math.BigDecimal (java.nio.HeapByteBuffer and java.math.BigDecimal are in module java.base of loader 'bootstrap')
at io.confluent.base.model.Test1.put(Test1.java:115)
at org.apache.avro.generic.GenericData.setField(GenericData.java:837)
at org.apache.avro.specific.SpecificDatumReader.readField(SpecificDatumReader.java:139)
at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:248)
at org.apache.avro.specific.SpecificDatumReader.readRecord(SpecificDatumReader.java:123)
at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:180)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:161)
at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:188)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:161)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:154)
at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer$DeserializationContext.read(AbstractKafkaAvroDeserializer.java:400)
The sample message you can try is -
{ "io.confluent.base.model.Test1": \{"opt_amount": { "bytes": "10.2"}}}
> Union type not inheriting type conversions
> ------------------------------------------
>
> Key: AVRO-3536
> URL: https://issues.apache.org/jira/browse/AVRO-3536
> Project: Apache Avro
> Issue Type: Bug
> Components: java
> Affects Versions: 1.11.0
> Reporter: Prathamesh
> Priority: Major
> Attachments: Test1.avsc
>
>
> The attached schema has a top-level Avro union type. Whenever the client tries
> to deserialize a message, it fails while parsing the field opt_amount, with the
> following stack trace:
> {code:java}
> Caused by: java.lang.ClassCastException: class java.nio.HeapByteBuffer cannot be cast to class java.math.BigDecimal (java.nio.HeapByteBuffer and java.math.BigDecimal are in module java.base of loader 'bootstrap')
> at io.confluent.base.model.Test1.put(Test1.java:115)
> at org.apache.avro.generic.GenericData.setField(GenericData.java:837)
> at org.apache.avro.specific.SpecificDatumReader.readField(SpecificDatumReader.java:139)
> at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:248)
> at org.apache.avro.specific.SpecificDatumReader.readRecord(SpecificDatumReader.java:123)
> at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:180)
> at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:161)
> at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:188)
> at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:161)
> at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:154)
> at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer$DeserializationContext.read(AbstractKafkaAvroDeserializer.java:400)
> {code}
> A sample message you can try:
> {code:java}
> { "io.confluent.base.model.Test1": { "opt_amount": { "bytes": "10.2" } } }
> {code}
--
This message was sent by Atlassian Jira (v8.20.7#820007)