[ https://issues.apache.org/jira/browse/AVRO-1891?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16711160#comment-16711160 ]

ASF GitHub Bot commented on AVRO-1891:
--------------------------------------

scatrin commented on issue #118: AVRO-1891: Fix specific nested logical types
URL: https://github.com/apache/avro/pull/118#issuecomment-444798975
 
 
   @lazyval The workaround I used applies to a setup with Kafka and 
Confluent's Schema Registry. The trick is to modify the 
serializers/deserializers (from Confluent) to explicitly register the needed 
conversions when creating the SpecificDatumWriter/Reader (for instance 
[here](https://github.com/confluentinc/schema-registry/blob/ce5132bef1f6f49a7a174183a6867afbb8d564b8/avro-serializer/src/main/java/io/confluent/kafka/serializers/AbstractKafkaAvroSerializer.java#L97)).
 SpecificDatumWriter has a 
[getter](https://github.com/apache/avro/blob/b4ede4b116b24b5308e8419504a73e02b7f7e406/lang/java/avro/src/main/java/org/apache/avro/specific/SpecificDatumReader.java#L63)
 for the SpecificData instance, and SpecificData (via its superclass) has a 
[method](https://github.com/apache/avro/blob/bbea1cec3ed6ffae631b4c0d975639a098687ab4/lang/java/avro/src/main/java/org/apache/avro/generic/GenericData.java#L114)
 for registering logical type conversions. The cool thing is that the 
conversion class does not have to be part of Avro; you can build your own if 
you want to. Then just make sure that the modified classes are on the classpath 
when serializing/deserializing and that they are referenced properly in the Kafka 
settings. A sketch of the registration step is shown below.
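
   A minimal sketch of that registration step, assuming Avro 1.8.x with the 
joda-time conversions in org.apache.avro.data.TimeConversions and the generated 
RecordV1 class from the schema quoted below. The surrounding Confluent 
serializer/deserializer wiring is omitted, so treat this as an illustration, 
not the exact Confluent code:

{code}
// Register the timestamp conversion on a SpecificData instance and hand that
// instance to the writer/reader created by the modified serializer/deserializer.
import java.io.ByteArrayOutputStream;
import org.apache.avro.Schema;
import org.apache.avro.data.TimeConversions;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.specific.SpecificData;
import org.apache.avro.specific.SpecificDatumReader;
import org.apache.avro.specific.SpecificDatumWriter;
import org.brasslock.event.RecordV1; // generated from the schema in this issue
import org.joda.time.DateTime;

SpecificData specificData = new SpecificData();
specificData.addLogicalTypeConversion(new TimeConversions.TimestampConversion());

Schema schema = RecordV1.getClassSchema();

// Writer used when serializing (e.g. inside the Confluent serializer).
DatumWriter<RecordV1> writer = new SpecificDatumWriter<>(schema, specificData);

// Reader used when deserializing; the same SpecificData carries the conversion.
SpecificDatumReader<RecordV1> reader = new SpecificDatumReader<>(schema, schema, specificData);

// Round trip: encode a record, then decode it back through the conversion.
ByteArrayOutputStream out = new ByteArrayOutputStream();
BinaryEncoder encoder = EncoderFactory.get().directBinaryEncoder(out, null);
writer.write(new RecordV1(DateTime.parse("2016-07-29T10:15:30.00Z")), encoder);
encoder.flush();

BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
RecordV1 roundTripped = reader.read(null, decoder);
{code}

   This mirrors the fix shown in the issue description below, extended to the 
read path, so both serialization and deserialization go through the registered 
conversion instead of falling back to the raw joda type.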

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Generated Java code fails with union containing logical type
> ------------------------------------------------------------
>
>                 Key: AVRO-1891
>                 URL: https://issues.apache.org/jira/browse/AVRO-1891
>             Project: Apache Avro
>          Issue Type: Bug
>          Components: java, logical types
>    Affects Versions: 1.8.1
>            Reporter: Ross Black
>            Priority: Blocker
>             Fix For: 1.8.3
>
>         Attachments: AVRO-1891.patch, AVRO-1891.yshi.1.patch, 
> AVRO-1891.yshi.2.patch, AVRO-1891.yshi.3.patch, AVRO-1891.yshi.4.patch
>
>
> Example schema:
> {code}
>     {
>       "type": "record",
>       "name": "RecordV1",
>       "namespace": "org.brasslock.event",
>       "fields": [
>         { "name": "first", "type": ["null", {"type": "long", 
> "logicalType":"timestamp-millis"}]}
>       ]
>     }
> {code}
> The Avro compiler generates a field using the relevant Joda-Time class:
> {code}
>     public org.joda.time.DateTime first
> {code}
> Running the following code to perform encoding:
> {code}
>         final RecordV1 record = new RecordV1(DateTime.parse("2016-07-29T10:15:30.00Z"));
>         final DatumWriter<RecordV1> datumWriter = new SpecificDatumWriter<>(record.getSchema());
>         final ByteArrayOutputStream stream = new ByteArrayOutputStream(8192);
>         final BinaryEncoder encoder = EncoderFactory.get().directBinaryEncoder(stream, null);
>         datumWriter.write(record, encoder);
>         encoder.flush();
>         final byte[] bytes = stream.toByteArray();
> {code}
> fails with this exception stack trace:
> {code}
>  org.apache.avro.AvroRuntimeException: Unknown datum type org.joda.time.DateTime: 2016-07-29T10:15:30.000Z
>     at org.apache.avro.generic.GenericData.getSchemaName(GenericData.java:741)
>     at org.apache.avro.specific.SpecificData.getSchemaName(SpecificData.java:293)
>     at org.apache.avro.generic.GenericData.resolveUnion(GenericData.java:706)
>     at org.apache.avro.generic.GenericDatumWriter.resolveUnion(GenericDatumWriter.java:192)
>     at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:110)
>     at org.apache.avro.specific.SpecificDatumWriter.writeField(SpecificDatumWriter.java:87)
>     at org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:143)
>     at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:105)
>     at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:73)
>     at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:60)
>     at org.brasslock.avro.compiler.GeneratedRecordTest.shouldEncodeLogicalTypeInUnion(GeneratedRecordTest.java:82)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
>     at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>     at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
>     at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>     at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
>     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
>     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
>     at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
>     at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
>     at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
>     at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
>     at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
>     at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
>     at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
>     at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:117)
>     at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:42)
>     at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:253)
>     at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:84)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
> {code}
> The failure can be fixed by explicitly adding the relevant conversion(s) to 
> DatumWriter / SpecificData:
> {code}
>         final RecordV1 record = new RecordV1(DateTime.parse("2007-12-03T10:15:30.00Z"));
>         final SpecificData specificData = new SpecificData();
>         specificData.addLogicalTypeConversion(new TimeConversions.TimestampConversion());
>         final DatumWriter<RecordV1> datumWriter = new SpecificDatumWriter<>(record.getSchema(), specificData);
>         final ByteArrayOutputStream stream = new ByteArrayOutputStream(AvroUtil.DEFAULT_BUFFER_SIZE);
>         final BinaryEncoder encoder = EncoderFactory.get().directBinaryEncoder(stream, null);
>         datumWriter.write(record, encoder);
>         encoder.flush();
>         final byte[] bytes = stream.toByteArray();
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
