[
https://issues.apache.org/jira/browse/AVRO-1891?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16591290#comment-16591290
]
ASF GitHub Bot commented on AVRO-1891:
--------------------------------------
scatrin commented on issue #329: Improved conversions handling + pluggable
conversions support [AVRO-1891, AVRO-2065]
URL: https://github.com/apache/avro/pull/329#issuecomment-415683420
@ivangreene As I understand it, the question is whether a generated Avro type can be used
as the target type of a custom conversion, so that wherever you use the logical
type "decimal" you would get that generated class as the type of your field?
The target type of the conversion needs to be on the classpath of the
Conversion implementation, and the Conversion implementation in turn needs to be on the
classpath of the Maven plugin. (See the integration tests for an example of
this.) So it would be hard to do this when both Avro types are generated in the
same build, without some serious classloading magic in the Maven plugin.
However, you could have the project containing the conversion generate its
target type from an Avro schema, although that doesn't make much sense to me.
It should be pretty straightforward to add tests with new types of schemas
in the codegen-test project if you want to try something out.
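For concreteness, a minimal sketch of such a custom Conversion (the class name
MillisInstantConversion and the java.time.Instant target type are illustrative
assumptions, not taken from the PR); the target type must be on this class's
classpath, and the compiled class in turn on the Maven plugin's classpath:
{code}
import java.time.Instant;

import org.apache.avro.Conversion;
import org.apache.avro.LogicalType;
import org.apache.avro.LogicalTypes;
import org.apache.avro.Schema;

// Hypothetical conversion: maps the "timestamp-millis" logical type to java.time.Instant.
public class MillisInstantConversion extends Conversion<Instant> {

  @Override
  public Class<Instant> getConvertedType() {
    return Instant.class;
  }

  @Override
  public String getLogicalTypeName() {
    return "timestamp-millis";
  }

  // Underlying Avro type is long (millis since epoch), so override the long hooks.
  @Override
  public Instant fromLong(Long millisFromEpoch, Schema schema, LogicalType type) {
    return Instant.ofEpochMilli(millisFromEpoch);
  }

  @Override
  public Long toLong(Instant instant, Schema schema, LogicalType type) {
    return instant.toEpochMilli();
  }

  @Override
  public Schema getRecommendedSchema() {
    return LogicalTypes.timestampMillis().addToSchema(Schema.create(Schema.Type.LONG));
  }
}
{code}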
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
> Generated Java code fails with union containing logical type
> ------------------------------------------------------------
>
> Key: AVRO-1891
> URL: https://issues.apache.org/jira/browse/AVRO-1891
> Project: Avro
> Issue Type: Bug
> Components: java, logical types
> Affects Versions: 1.8.1
> Reporter: Ross Black
> Priority: Blocker
> Fix For: 1.8.3
>
> Attachments: AVRO-1891.patch, AVRO-1891.yshi.1.patch,
> AVRO-1891.yshi.2.patch, AVRO-1891.yshi.3.patch, AVRO-1891.yshi.4.patch
>
>
> Example schema:
> {code}
> {
>   "type": "record",
>   "name": "RecordV1",
>   "namespace": "org.brasslock.event",
>   "fields": [
>     { "name": "first", "type": ["null", {"type": "long", "logicalType": "timestamp-millis"}] }
>   ]
> }
> {code}
> The Avro compiler generates a field using the relevant Joda-Time class:
> {code}
> public org.joda.time.DateTime first
> {code}
> Running the following code to perform encoding:
> {code}
> final RecordV1 record = new RecordV1(DateTime.parse("2016-07-29T10:15:30.00Z"));
> final DatumWriter<RecordV1> datumWriter = new SpecificDatumWriter<>(record.getSchema());
> final ByteArrayOutputStream stream = new ByteArrayOutputStream(8192);
> final BinaryEncoder encoder = EncoderFactory.get().directBinaryEncoder(stream, null);
> datumWriter.write(record, encoder);
> encoder.flush();
> final byte[] bytes = stream.toByteArray();
> {code}
> fails with the following exception stack trace:
> {code}
> org.apache.avro.AvroRuntimeException: Unknown datum type org.joda.time.DateTime: 2016-07-29T10:15:30.000Z
> at org.apache.avro.generic.GenericData.getSchemaName(GenericData.java:741)
> at org.apache.avro.specific.SpecificData.getSchemaName(SpecificData.java:293)
> at org.apache.avro.generic.GenericData.resolveUnion(GenericData.java:706)
> at org.apache.avro.generic.GenericDatumWriter.resolveUnion(GenericDatumWriter.java:192)
> at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:110)
> at org.apache.avro.specific.SpecificDatumWriter.writeField(SpecificDatumWriter.java:87)
> at org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:143)
> at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:105)
> at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:73)
> at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:60)
> at org.brasslock.avro.compiler.GeneratedRecordTest.shouldEncodeLogicalTypeInUnion(GeneratedRecordTest.java:82)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
> at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
> at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:117)
> at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:42)
> at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:253)
> at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:84)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
> {code}
> The failure can be fixed by explicitly adding the relevant conversion(s) to
> DatumWriter / SpecificData:
> {code}
> final RecordV1 record = new RecordV1(DateTime.parse("2007-12-03T10:15:30.00Z"));
> final SpecificData specificData = new SpecificData();
> specificData.addLogicalTypeConversion(new TimeConversions.TimestampConversion());
> final DatumWriter<RecordV1> datumWriter = new SpecificDatumWriter<>(record.getSchema(), specificData);
> final ByteArrayOutputStream stream = new ByteArrayOutputStream(AvroUtil.DEFAULT_BUFFER_SIZE);
> final BinaryEncoder encoder = EncoderFactory.get().directBinaryEncoder(stream, null);
> datumWriter.write(record, encoder);
> encoder.flush();
> final byte[] bytes = stream.toByteArray();
> {code}
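> The same workaround applies on the read side; a minimal sketch, assuming the bytes
> produced above and a SpecificData instance with the conversion registered (imports
> mirror the snippet above, plus SpecificDatumReader, DecoderFactory and Decoder):
> {code}
> // Register the same conversion so the long branch of the union is converted back to a DateTime.
> final SpecificData specificData = new SpecificData();
> specificData.addLogicalTypeConversion(new TimeConversions.TimestampConversion());
> final DatumReader<RecordV1> datumReader =
>     new SpecificDatumReader<>(RecordV1.getClassSchema(), RecordV1.getClassSchema(), specificData);
> final Decoder decoder = DecoderFactory.get().binaryDecoder(bytes, null);
> final RecordV1 decoded = datumReader.read(null, decoder);
> {code}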
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)