Eric C Abis created KAFKA-7688:
----------------------------------
Summary: Allow byte array class for Decimal Logical Types to fix
Debezium Issues
Key: KAFKA-7688
URL: https://issues.apache.org/jira/browse/KAFKA-7688
Project: Kafka
Issue Type: Bug
Components: KafkaConnect
Affects Versions: 1.1.1
Reporter: Eric C Abis
Fix For: 1.1.1
Decimal Logical Type fields are failing with Kafka Connect sink tasks and
showing this error:
{code:java}
Invalid Java object for schema type BYTES: class [B for field: "null"{code}
The problem is also tracked in the Confluent Schema Registry repository (the two issues are related):
[https://github.com/confluentinc/schema-registry/issues/833]
I've written a fix for this issue and verified it in our CF4 cluster here at
Shutterstock.
Ultimately the issue boils down to the fact that Avro Decimal Logical Types
store default values as Base64-encoded byte arrays and record values as
BigInteger byte arrays. I'd like to submit a PR that changes the
SCHEMA_TYPE_CLASSES hash map in org.apache.kafka.connect.data.ConnectSchema to
also allow byte arrays for Decimal fields.
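To illustrate the mismatch: a Connect/Avro decimal is wired as the unscaled
two's-complement bytes of a BigInteger plus a scale, so what actually arrives in
a record is a plain {{byte[]}} (class {{[B}}), not a {{BigDecimal}}. A minimal
standalone sketch of that round trip (the class and method names here are
illustrative, not the actual ConnectSchema code):
{code:java}
import java.math.BigDecimal;
import java.math.BigInteger;

public class DecimalBytesDemo {
    // Decode a decimal from its wire form: unscaled two's-complement
    // bytes plus a fixed scale (as Kafka Connect's Decimal type does).
    static BigDecimal toLogical(byte[] bytes, int scale) {
        return new BigDecimal(new BigInteger(bytes), scale);
    }

    // Encode back to the byte[] form that the record actually carries.
    static byte[] fromLogical(BigDecimal value) {
        return value.unscaledValue().toByteArray();
    }

    public static void main(String[] args) {
        BigDecimal price = new BigDecimal("123.45");  // scale 2
        byte[] raw = fromLogical(price);
        // raw is class [B, not BigDecimal -- exactly the object that
        // trips "Invalid Java object for schema type BYTES: class [B"
        // when ConnectSchema validates a Decimal field.
        System.out.println(toLogical(raw, 2));        // prints 123.45
    }
}
{code}
Allowing {{byte[]}} alongside {{BigDecimal}} for Decimal fields lets validation
accept the value as it actually appears on the wire.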
Separately, I have a similar change in
[io.confluent.connect.avro.AvroData|https://github.com/TheGreatAbyss/schema-registry/pull/1/files#diff-ac149179f9760319ccc772695cb21364]
that I will submit a PR for as well.
I reached out to [[email protected]|mailto:[email protected]]
to ask for GitHub permissions, but if there is somewhere else I should reach
out, please let me know.
Thank You!
Eric
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)