cdmikechen commented on PR #3391:
URL: https://github.com/apache/hudi/pull/3391#issuecomment-1279699768

   > @cdmikechen thanks for your effort. First of all, I agree with your modification, so I approve this PR.
   > 
   > Our production env is Hive 3; we solved this problem by modifying `HoodieRealtimeRecordReaderUtils.avroToArrayWritable`:
   > 
   > ```java
   >       case LONG:
   >         if (supportTimestamp
   >             && schema.getLogicalType() != null
   >             && schema.getLogicalType().getName().equals("timestamp-micros")) {
   >           // TODO: Here we use the Hive interface to transform long to timestamp with local timezone.
   >           // Notice that Hive supports UTC and local timezone, but here we only support local timezone.
   >           Timestamp timestamp = ParquetTimestampUtils.getTimestamp((Long) value, LogicalTypeAnnotation.TimeUnit.MICROS, true);
   >           return new TimestampWritableV2(timestamp);
   >         }
   >         return new LongWritable((Long) value);
   > ```
   
   Yes~ If we want to stay compatible with both Hive 2 and Hive 3, as well as with the previous behavior that exposed the value as a plain long, we have to do some independent compatibility work.
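
   For reference, here is a rough, self-contained sketch (plain JDK, no Hive or Parquet classes) of the micros-to-`Timestamp` conversion that the `ParquetTimestampUtils.getTimestamp(..., TimeUnit.MICROS, ...)` call in the snippet above performs; the class and helper names below are hypothetical, not part of Hudi or Hive:

   ```java
   import java.sql.Timestamp;

   public class MicrosToTimestamp {
       // Convert an Avro timestamp-micros long (microseconds since epoch)
       // into a java.sql.Timestamp, preserving microsecond precision via setNanos.
       static Timestamp fromMicros(long micros) {
           // floorDiv/floorMod keep pre-epoch (negative) values correct.
           long millis = Math.floorDiv(micros, 1000L);
           int nanos = (int) Math.floorMod(micros, 1_000_000L) * 1000;
           Timestamp ts = new Timestamp(millis);
           ts.setNanos(nanos);
           return ts;
       }

       public static void main(String[] args) {
           long micros = 1_600_000_000_123_456L; // 2020-09-13T12:26:40.123456Z
           Timestamp ts = fromMicros(micros);
           System.out.println(ts.getTime() + " " + ts.getNanos());
       }
   }
   ```

   A `Timestamp` built this way is what would then be wrapped in `TimestampWritableV2` on Hive 3; on Hive 2 (or for backward compatibility) the raw long would be returned in a `LongWritable` instead.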


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
