wangyum commented on a change in pull request #25277: [SPARK-28637][SQL] Thriftserver support interval type
URL: https://github.com/apache/spark/pull/25277#discussion_r320329219
 
 

 ##########
 File path: sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala
 ##########
 @@ -308,7 +311,11 @@ private[hive] class SparkExecuteStatementOperation(
 object SparkExecuteStatementOperation {
   def getTableSchema(structType: StructType): TableSchema = {
     val schema = structType.map { field =>
-      val attrTypeString = if (field.dataType == NullType) "void" else field.dataType.catalogString
+      val attrTypeString = field.dataType match {
+        case NullType => "void"
+        case CalendarIntervalType => "string"
+        case other => other.catalogString
 
 Review comment:
   I think we do not need to return `string` for `_: ArrayType | _: StructType | _: MapType | _: UserDefinedType[_]`, because that would not change the actual return type. It is just a workaround for the interval type (see the sketch after the example output):
   ```
   0: jdbc:hive2://localhost:10000> DESC SELECT interval '1' year '2' day AS i;
   +-----------+------------+----------+--+
   | col_name  | data_type  | comment  |
   +-----------+------------+----------+--+
   | i         | interval   | NULL     |
   +-----------+------------+----------+--+
   ```
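   For reference, a minimal sketch of the resulting mapping. The `toThriftTypeName` helper name is hypothetical and only Spark SQL's type classes are assumed, not the Hive `TableSchema` wiring:
   ```scala
   import org.apache.spark.sql.types._

   // Only NullType and CalendarIntervalType need special-casing; complex types such as
   // arrays, maps and structs keep their catalogString, so no `string` fallback is needed.
   def toThriftTypeName(dt: DataType): String = dt match {
     case NullType             => "void"
     case CalendarIntervalType => "string" // workaround discussed in this PR: report intervals as string
     case other                => other.catalogString
   }

   // toThriftTypeName(ArrayType(IntegerType))  // "array<int>" -- unchanged, no string fallback
   // toThriftTypeName(CalendarIntervalType)    // "string"
   ```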
