[
https://issues.apache.org/jira/browse/AVRO-2471?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17168948#comment-17168948
]
Ryan Skraba commented on AVRO-2471:
-----------------------------------
[~dlipofsky] and [~j0xaf] -- thanks for the amazing samples; they made it much
easier for me to see what was going on!
We could always find the conversions registered for all of the logical types
encountered by asking the SpecificData known while the record is being
generated. That seems more reasonable than the current approach of keying on
the "primitive" datum class (as a string) to look up the conversion (as a
string) during generation.
I'll give this strategy a try and see what it looks like.
I suspect the current implementation is already causing other issues, even
before the UNION case is taken into account. In the schema of one of the test
classes,
[FieldTest.java|https://github.com/apache/avro/blob/8197803dceb89d53d6f5a3e9143f46c035d6558b/lang/java/tools/src/test/compiler/output-string/avro/examples/baseball/FieldTest.java#L19],
four LogicalTypes are used in the record, but only two conversions are
registered.
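In the meantime, a possible workaround is to register the micros conversion on the model used for writing rather than relying on the generated MODEL$ block. This is only a sketch assuming the Avro 1.9.x API, and {{Data}} stands in for the class generated from the reporter's schema:
{code:java}
import org.apache.avro.data.TimeConversions;
import org.apache.avro.specific.SpecificData;
import org.apache.avro.specific.SpecificDatumWriter;

// Build a model that knows how to convert java.time.Instant for the
// timestamp-micros logical type.
SpecificData model = new SpecificData();
model.addLogicalTypeConversion(new TimeConversions.TimestampMicrosConversion());

// Pass the model explicitly to the writer; `Data` is the hypothetical
// generated class from the reporter's schema.
SpecificDatumWriter<Data> writer =
    new SpecificDatumWriter<>(Data.getClassSchema(), model);
{code}
With the conversion registered, resolveUnion should be able to match an Instant datum to the long branch carrying timestamp-micros instead of throwing "Unknown datum type".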
> Java maven plugin code generation doesn't add conversion for timestamp-micros
> -----------------------------------------------------------------------------
>
> Key: AVRO-2471
> URL: https://issues.apache.org/jira/browse/AVRO-2471
> Project: Apache Avro
> Issue Type: Bug
> Components: java
> Affects Versions: 1.9.0
> Reporter: Marek Tracz
> Assignee: Ryan Skraba
> Priority: Major
>
> Field in schema (note that no field uses the timestamp-millis logical type):
> {code:java}
> {
>   "name": "RECORDING_TIME",
>   "type": [
>     "null",
>     {
>       "type": "long",
>       "logicalType": "timestamp-micros"
>     }
>   ],
>   "default": null
> }
> {code}
> Maven plugin configuration:
> {code:xml}
> <plugin>
>   <groupId>org.apache.avro</groupId>
>   <artifactId>avro-maven-plugin</artifactId>
>   <version>1.9.0</version>
>   <executions>
>     <execution>
>       <goals>
>         <goal>schema</goal>
>       </goals>
>       <configuration>
>         <stringType>String</stringType>
>         <enableDecimalLogicalType>true</enableDecimalLogicalType>
>         <sourceDirectory>${project.basedir}/src/main/resources/</sourceDirectory>
>       </configuration>
>     </execution>
>   </executions>
> </plugin>
> {code}
> Part of the generated class:
> {code:java}
> private static SpecificData MODEL$ = new SpecificData();
> static {
>   MODEL$.addLogicalTypeConversion(new org.apache.avro.data.TimeConversions.DateConversion());
>   MODEL$.addLogicalTypeConversion(new org.apache.avro.data.TimeConversions.TimestampMillisConversion()); // <--- this should be TimestampMicrosConversion
>   MODEL$.addLogicalTypeConversion(new org.apache.avro.Conversions.DecimalConversion());
> }
> {code}
> For example, this code:
> {code:java}
> Data data = Data.newBuilder()
>     .setRECORDINGTIME(Instant.now())
>     .build();
> {code}
> Fails during serialization:
> {noformat}
> org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
> Caused by: org.apache.avro.AvroRuntimeException: Unknown datum type java.time.Instant: 2019-07-12T14:24:47.322Z
>     at org.apache.avro.generic.GenericData.getSchemaName(GenericData.java:887)
>     at org.apache.avro.specific.SpecificData.getSchemaName(SpecificData.java:420)
>     at org.apache.avro.generic.GenericData.resolveUnion(GenericData.java:850)
>     at org.apache.avro.generic.GenericDatumWriter.resolveUnion(GenericDatumWriter.java:249)
>     at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:142)
>     at org.apache.avro.specific.SpecificDatumWriter.writeField(SpecificDatumWriter.java:98)
>     at org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:195)
>     at org.apache.avro.specific.SpecificDatumWriter.writeRecord(SpecificDatumWriter.java:83)
>     at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:130)
>     at org.apache.avro.specific.SpecificDatumWriter.writeField(SpecificDatumWriter.java:98)
>     at org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:195)
>     at org.apache.avro.specific.SpecificDatumWriter.writeRecord(SpecificDatumWriter.java:83)
>     at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:130)
>     at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:82)
>     at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:72)
>     at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:92)
>     at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:53)
>     at org.apache.kafka.common.serialization.ExtendedSerializer$Wrapper.serialize(ExtendedSerializer.java:65)
>     at org.apache.kafka.common.serialization.ExtendedSerializer$Wrapper.serialize(ExtendedSerializer.java:55)
>     at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:841)
>     at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:803)
>     at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:690)
> {noformat}
> When the registration is manually changed to
> *org.apache.avro.data.TimeConversions.TimestampMicrosConversion*, everything
> works properly.
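> In other words, a sketch of the manually corrected static block (mirroring the generated code above, with only the millis conversion swapped for the micros one):
> {code:java}
> private static SpecificData MODEL$ = new SpecificData();
> static {
>   MODEL$.addLogicalTypeConversion(new org.apache.avro.data.TimeConversions.DateConversion());
>   // was TimestampMillisConversion in the generated code
>   MODEL$.addLogicalTypeConversion(new org.apache.avro.data.TimeConversions.TimestampMicrosConversion());
>   MODEL$.addLogicalTypeConversion(new org.apache.avro.Conversions.DecimalConversion());
> }
> {code}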
--
This message was sent by Atlassian Jira
(v8.3.4#803005)