AngersZhuuuu commented on a change in pull request #25694: [SPARK-28982][SQL] Implementation Spark's own GetTypeInfoOperation
URL: https://github.com/apache/spark/pull/25694#discussion_r322651301
 
 

 ##########
 File path: sql/hive-thriftserver/v2.3.5/src/main/scala/org/apache/spark/sql/hive/thriftserver/ThriftserverShimUtils.scala
 ##########
 @@ -56,6 +56,12 @@ private[thriftserver] object ThriftserverShimUtils {
 
   private[thriftserver] def toJavaSQLType(s: String): Int = Type.getType(s).toJavaSQLType
 
 +  private[thriftserver] def supportedType(): Seq[Type] = {
 +    Array(Type.NULL_TYPE, Type.BOOLEAN_TYPE, Type.TINYINT_TYPE, Type.SMALLINT_TYPE, Type.INT_TYPE,
 +      Type.BIGINT_TYPE, Type.FLOAT_TYPE, Type.DOUBLE_TYPE, Type.STRING_TYPE, Type.DATE_TYPE,
 +      Type.TIMESTAMP_TYPE, Type.DECIMAL_TYPE, Type.BINARY_TYPE)
 +  }
 
 Review comment:
   > But Spark will never return a USER_DEFINED type.
   > The current implementation of org.apache.spark.sql.types.UserDefinedType will return the underlying sqlType.simpleString as its catalogString, so Thriftserver queries will return the underlying type in the schema.
   > Hence for USER_DEFINED (and UNIONTYPE) the argument is not that they wouldn't potentially work, but that Spark does not use them.
   
   I'll remove them and resolve the conflicts.
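
   For reference, a minimal self-contained sketch of the behavior described above (the object name is illustrative, SQLDataTypes.VectorType is used only as a convenient public UDT instance, and the printed string is an example rather than verified output):

       import org.apache.spark.ml.linalg.SQLDataTypes
       import org.apache.spark.sql.types.{StructField, StructType}

       object UdtCatalogStringDemo {
         def main(args: Array[String]): Unit = {
           // VectorType is backed by a UserDefinedType. UserDefinedType.catalogString
           // delegates to the underlying sqlType.simpleString, so metadata consumers
           // such as the Thrift server only ever see the underlying struct type.
           val schema = StructType(Seq(StructField("features", SQLDataTypes.VectorType)))
           println(schema("features").dataType.catalogString)
           // e.g. struct<type:tinyint,size:int,indices:array<int>,values:array<double>>
         }
       }

   Since a UDT is erased to its underlying sqlType before it reaches the Thrift layer, GetTypeInfo has no reason to advertise USER_DEFINED.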

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
