JingsongLi commented on a change in pull request #12403:
URL: https://github.com/apache/flink/pull/12403#discussion_r436433217



##########
File path: 
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/util/HiveTypeUtil.java
##########
@@ -182,11 +182,17 @@ private static DataType toFlinkPrimitiveType(PrimitiveTypeInfo hiveType) {

		@Override
		public TypeInfo visit(CharType charType) {
-			if (charType.getLength() > HiveChar.MAX_CHAR_LENGTH) {
-				throw new CatalogException(
-						String.format("HiveCatalog doesn't support char type with length of '%d'. " +
-								"The maximum length is %d",
+			// Flink treats string literal UDF parameters as CHAR. Such types may have precisions not supported by
+			// Hive, e.g. CHAR(0). Promote it to STRING in such case if we're told not to check precision.
+			if (charType.getLength() > HiveChar.MAX_CHAR_LENGTH || charType.getLength() < 1) {

Review comment:
       No, you cannot make that assumption.
   There may be a VarChar with zero precision somewhere.
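To illustrate the logic under discussion, here is a minimal, standalone sketch of the check the diff introduces: lengths outside Hive's supported CHAR range are either rejected or promoted to STRING, depending on whether precision checking is enabled. This is not Flink's actual code; the class and method names are hypothetical, and `MAX_CHAR_LENGTH` is hard-coded to 255 (the value of Hive's `HiveChar.MAX_CHAR_LENGTH`) to keep the example self-contained.

```java
public class CharPromotionSketch {

	// Hive's HiveChar.MAX_CHAR_LENGTH; hard-coded here for illustration.
	static final int MAX_CHAR_LENGTH = 255;

	/**
	 * Hypothetical helper mirroring the diff: returns the Hive type string for
	 * a Flink CHAR(length). Out-of-range lengths (e.g. CHAR(0)) either throw
	 * or are promoted to STRING, depending on checkPrecision.
	 */
	static String toHiveCharType(int length, boolean checkPrecision) {
		if (length > MAX_CHAR_LENGTH || length < 1) {
			if (checkPrecision) {
				throw new IllegalArgumentException(
						"HiveCatalog doesn't support char type with length of '" + length + "'");
			}
			// Promote to STRING when we're told not to check precision.
			return "STRING";
		}
		return "CHAR(" + length + ")";
	}

	public static void main(String[] args) {
		System.out.println(toHiveCharType(0, false));
		System.out.println(toHiveCharType(10, false));
		System.out.println(toHiveCharType(300, false));
	}
}
```

The reviewer's objection is to the `length < 1` branch: it assumes an out-of-range length can only come from a CHAR literal parameter, which may not hold in general.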



