bowenli86 commented on a change in pull request #8769: [FLINK-12875][hive] 
support converting input args of char, varchar, bytes, timestamp, date for Hive 
functions
URL: https://github.com/apache/flink/pull/8769#discussion_r294941682
 
 

 ##########
 File path: 
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/functions/hive/conversion/HiveInspectors.java
 ##########
 @@ -162,34 +220,64 @@ public static Object toFlinkObject(ObjectInspector inspector, Object data) {
 
                                return oi.preferWritable() ?
                                        oi.get(data) :
-                                       oi.getPrimitiveWritableObject(data);
+                                       oi.getPrimitiveJavaObject(data);
 
 Review comment:
   I read the source code and did some more testing. HiveSimpleUDF and 
HiveGenericUDF don't need writables, because we explicitly require primitive return 
types in both of them. I'm not fully sure about UDTF and UDAF, as we don't have 
them yet, nor do we have tests for them. 
   
   I recommend keeping them for now, as they do no harm. I've created 
"[FLINK-12891] evaluate converting hadoop/hive writable between Hive functions 
and Flink" to re-evaluate this issue once we have support for all Hive 
functions in place in 1.9. 
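   To make the writable-vs-Java-object distinction concrete, here is a minimal, self-contained sketch of the branch changed in the diff above. Note this is an illustrative mock, not the real Hive API: `PrimitiveInspector` is a simplified stand-in for Hive's `PrimitiveObjectInspector`, so Flink's actual `HiveInspectors.toFlinkObject` differs in its surrounding dispatch logic.

```java
public class InspectorSketch {

    // Simplified stand-in for Hive's PrimitiveObjectInspector contract
    // (illustrative only; the real interface has more methods).
    interface PrimitiveInspector {
        boolean preferWritable();
        Object get(Object data);                    // writable-backed access
        Object getPrimitiveJavaObject(Object data); // plain Java object (String, Integer, ...)
    }

    // Mirrors the patched branch: when the inspector does not prefer
    // writables, hand Flink the plain Java object rather than a
    // hadoop Writable such as Text or IntWritable.
    static Object toFlinkObject(PrimitiveInspector oi, Object data) {
        return oi.preferWritable() ?
                oi.get(data) :
                oi.getPrimitiveJavaObject(data);
    }

    public static void main(String[] args) {
        // An inspector over a Java-object-backed String, so
        // preferWritable() is false and the Java branch is taken.
        PrimitiveInspector javaStringInspector = new PrimitiveInspector() {
            public boolean preferWritable() { return false; }
            public Object get(Object data) { return data; }
            public Object getPrimitiveJavaObject(Object data) { return data.toString(); }
        };

        Object result = toFlinkObject(javaStringInspector, new StringBuilder("hello"));
        System.out.println(result instanceof String); // prints "true"
    }
}
```

   This is also why keeping the writable path costs nothing for now: when an inspector does prefer writables (e.g. for a UDTF/UDAF we haven't wired up yet), the first branch still returns the writable unchanged.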
