[GitHub] [hudi] li36909 commented on issue #2544: [SUPPORT]failed to read timestamp column in version 0.7.0 even when HIVE_SUPPORT_TIMESTAMP is enabled

2021-04-08 Thread GitBox


li36909 commented on issue #2544:
URL: https://github.com/apache/hudi/issues/2544#issuecomment-816365255


   @cdmikechen thank you for your explanation. I use Hudi 0.7 + Spark 2.4.5 +
Hive 3.1 and didn't test with Hive 2.*. If possible, please fix this issue for
Hive 3 as well. Thank you very much.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [hudi] li36909 commented on issue #2544: [SUPPORT]failed to read timestamp column in version 0.7.0 even when HIVE_SUPPORT_TIMESTAMP is enabled

2021-04-08 Thread GitBox


li36909 commented on issue #2544:
URL: https://github.com/apache/hudi/issues/2544#issuecomment-815849870


   @nsivabalan @cdmikechen 
   I fixed it and the test passes with a simple change like this, in
hudi-hadoop-mr/src/main/java/org/apache/hudi/hadoop/utils/HoodieRealtimeRecordReaderUtils.java:
   
   case LONG:
   +  // Avro's timestamp-micros is a long of microseconds; convert it to a Hive timestamp
   +  if (schema.getLogicalType() != null
   +      && schema.getLogicalType().getName().equals("timestamp-micros")
   +      && supportTimestamp) {
   +    Timestamp timestamp = new Timestamp();
   +    timestamp.setTimeInMillis((Long) value / 1000);
   +    return new TimestampWritableV2(timestamp);
   +  }
   
   Converting the long value to a timestamp here makes it work. I also pass the
supportTimestamp config to every reader and writer. Did I miss anything? Thank you.
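
   For anyone following along, here is a minimal standalone sketch of the same
micros-to-millis conversion using only java.time, so it runs without any Hudi or
Hive dependency (the Hive Timestamp/TimestampWritableV2 wrapping from the patch
above is left out, and the sample value is made up):

   import java.time.Instant;

   public class TimestampMicrosDemo {
       // Avro's timestamp-micros logical type stores microseconds since the epoch,
       // while Hive's Timestamp.setTimeInMillis(...) expects milliseconds.
       static long microsToMillis(long epochMicros) {
           return epochMicros / 1000L;
       }

       public static void main(String[] args) {
           long epochMicros = 1617840000000000L; // made-up sample value
           long epochMillis = microsToMillis(epochMicros);
           System.out.println(Instant.ofEpochMilli(epochMillis)); // prints 2021-04-08T00:00:00Z
       }
   }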

