[ https://issues.apache.org/jira/browse/AVRO-2123?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17776341#comment-17776341 ]

ASF subversion and git services commented on AVRO-2123:
-------------------------------------------------------

Commit abe6d421e82fa1d7e2ff4ae19b8a73f2e75ca396 in avro's branch 
refs/heads/dependabot/maven/lang/java/org.apache.thrift-libthrift-0.19.0 from 
Oscar Westra van Holthe - Kind
[ https://gitbox.apache.org/repos/asf?p=avro.git;h=abe6d421e ]

AVRO-2123: Java duration logical type (#2520)

This adds a new `java.time.TemporalAmount` implementation, which
supports the Avro `duration` logical type (neither `java.time.Period`
nor `java.time.Duration` support it).

This type is then used in the conversion for the (new) logical type
implementation "duration".

Last, the logical type "uuid" is refactored to include validation.
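Per the Avro specification, the `duration` logical type annotates a fixed type of size 12 storing three little-endian unsigned 32-bit integers: months, days, and milliseconds. A minimal sketch of that byte layout is shown below; the class and method names are illustrative only and are not part of Avro's API or of the new `TemporalAmount` implementation referenced in the commit.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Illustrative sketch of the duration fixed(12) layout from the Avro spec:
// three little-endian unsigned 32-bit ints (months, days, milliseconds).
public class DurationFixed {

    // Pack months/days/millis into the 12-byte little-endian fixed value.
    static byte[] encode(long months, long days, long millis) {
        ByteBuffer buf = ByteBuffer.allocate(12).order(ByteOrder.LITTLE_ENDIAN);
        buf.putInt((int) months).putInt((int) days).putInt((int) millis);
        return buf.array();
    }

    // Unpack the 12-byte fixed value, treating each field as unsigned.
    static long[] decode(byte[] fixed) {
        ByteBuffer buf = ByteBuffer.wrap(fixed).order(ByteOrder.LITTLE_ENDIAN);
        return new long[] {
            Integer.toUnsignedLong(buf.getInt()),   // months
            Integer.toUnsignedLong(buf.getInt()),   // days
            Integer.toUnsignedLong(buf.getInt())    // milliseconds
        };
    }

    public static void main(String[] args) {
        long[] v = decode(encode(1, 15, 500));
        System.out.println(v[0] + " months, " + v[1] + " days, " + v[2] + " ms");
    }
}
```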

> Logical types timestamp-micros, duration, decimal bug with SpecificRecord
> -------------------------------------------------------------------------
>
>                 Key: AVRO-2123
>                 URL: https://issues.apache.org/jira/browse/AVRO-2123
>             Project: Apache Avro
>          Issue Type: Bug
>          Components: java
>    Affects Versions: 1.8.2
>         Environment: All (Linux and Windows)
>            Reporter: Daniel Egloff
>            Priority: Critical
>              Labels: pull-request-available
>         Attachments: FixedDuration.java, 
> TestRecordWithLogicalTypesBroken.java, specific_types_broken.avsc
>
>   Original Estimate: 3h
>          Time Spent: 1h 40m
>  Remaining Estimate: 1h 20m
>
> The logical types date, time-millis, and timestamp-millis are all handled 
> properly with a converter. Specifying a schema and generating code results in 
> a good constructor, which uses the proper types, and serialization, e.g. from 
> and to JSON, works as expected. 
> However, if timestamp-micros, duration, or decimal are used in the schema, no 
> converter is generated.
> Consider this schema:
> {
>   "type" : "record",
>   "name" : "TestRecordWithLogicalTypesBroken",
>   "doc" : "Schema from Avro Project - the three last fields are not properly 
> converted",
>   "namespace" : "com.flink.ai.kafka.lab.types",
>   "fields" : [ {
>     "name" : "b",
>     "type" : "boolean"
>   }, {
>     "name" : "i32",
>     "type" : "int"
>   }, {
>     "name" : "i64",
>     "type" : "long"
>   }, {
>     "name" : "f32",
>     "type" : "float"
>   }, {
>     "name" : "f64",
>     "type" : "double"
>   }, {
>     "name" : "s",
>     "type" : [ "null", "string" ],
>     "default" : null
>   }, {
>     "name" : "d",
>     "type" : {
>       "type" : "int",
>       "logicalType" : "date"
>     }
>   }, {
>     "name" : "t",
>     "type" : {
>       "type" : "int",
>       "logicalType" : "time-millis"
>     }
>   }, {
>     "name" : "ts",
>     "type" : {
>       "type" : "long",
>       "logicalType" : "timestamp-millis"
>     }
>   }, {
>     "name" : "tsm",
>     "type" : {
>       "type" : "long",
>       "logicalType" : "timestamp-micros"
>     }
>   }, {
>     "name" : "dur",
>     "type" : {
>       "type" : "fixed",
>       "size" : 12,
>       "name" : "FixedDuration",
>       "logicalType" : "duration"
>     }
>   }, {
>     "name" : "dec",
>     "type" : {
>       "type" : "bytes",
>       "logicalType" : "decimal",
>       "precision" : 9,
>       "scale" : 2
>     }
>   }]
> }
> It results in the following generated code. First, the conversions:
> private static final org.apache.avro.Conversion<?>[] conversions =
>       new org.apache.avro.Conversion<?>[] {
>       null,
>       null,
>       null,
>       null,
>       null,
>       null,
>       DATE_CONVERSION,
>       TIME_CONVERSION,
>       TIMESTAMP_CONVERSION,
>       null,    // should not be null, should be a proper timestamp converter
>       null,    // should not be null, should be a proper duration converter
>       null,    // should not be null, should be a proper decimal converter
>       null
>   };
> The constructor also uses the underlying basic types:
> public TestRecordWithLogicalTypesBroken(java.lang.Boolean b, 
> java.lang.Integer i32, java.lang.Long i64, java.lang.Float f32, 
> java.lang.Double f64, java.lang.CharSequence s, org.joda.time.LocalDate d, 
> org.joda.time.LocalTime t, org.joda.time.DateTime ts, java.lang.Long tsm, 
> com.flink.ai.kafka.lab.types.FixedDuration dur, java.nio.ByteBuffer dec) {...}
>  
> This is not in line with the documentation here:
> https://avro.apache.org/docs/1.8.2/spec.html#Timestamp+%28microsecond+precision%29
> https://avro.apache.org/docs/1.8.2/spec.html#Duration
> https://avro.apache.org/docs/1.8.2/spec.html#Decimal
> Do I somehow use the code generation wrong?
> I also observed that the tests contain a frozen (checked-in) class: 
> https://github.com/apache/avro/blob/master/lang/java/avro/src/test/java/org/apache/avro/specific/TestRecordWithLogicalTypes.java
> I cannot reproduce it with the code generation in 1.8.2. Also note that it 
> has a decimal converter generated:
>   private final org.apache.avro.Conversion<?>[] conversions =
>       new org.apache.avro.Conversion<?>[] {
>       null,
>       null,
>       null,
>       null,
>       null,
>       null,
>       DATE_CONVERSION,
>       TIME_CONVERSION,
>       TIMESTAMP_CONVERSION,
>       DECIMAL_CONVERSION,
>       null
>   };
> I have attached the schema I used and the code generated from it.
> Thanks for looking into this.
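For the decimal case specifically, a known workaround (assuming code is generated via the avro-maven-plugin rather than the avro-tools CLI) is that the decimal conversion is opt-in through the plugin's `enableDecimalLogicalType` configuration flag; the exact plugin version shown here is illustrative. This does not help with timestamp-micros or duration, whose conversions are what AVRO-2123 adds.

```xml
<plugin>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-maven-plugin</artifactId>
  <version>1.8.2</version>
  <configuration>
    <!-- opt in to BigDecimal accessors and a generated DECIMAL_CONVERSION -->
    <enableDecimalLogicalType>true</enableDecimalLogicalType>
  </configuration>
</plugin>
```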



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
