yaooqinn commented on a change in pull request #2212:
URL: https://github.com/apache/incubator-kyuubi/pull/2212#discussion_r835744955
##########
File path:
externals/kyuubi-spark-sql-engine/src/main/scala/org/apache/kyuubi/engine/spark/schema/SchemaHelper.scala
##########
@@ -39,10 +39,13 @@ object SchemaHelper {
case _: DecimalType => TTypeId.DECIMAL_TYPE
case DateType => TTypeId.DATE_TYPE
case TimestampType => TTypeId.TIMESTAMP_TYPE
+    case tntz if tntz.simpleString.equals("timestamp_ntz") =>
+      TTypeId.TIMESTAMP_TYPE
Review comment:
Ideally, you are right, but there are some historical reasons. At the
very beginning, Hive did not support timestamp with local time zone while Spark
did; each side had only one timestamp type, so the two were treated as the same.
Now both sides support two timestamp types, and the problem caused by the old
mapping stands out. I guess we should learn how Spark deals with this mess.
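To illustrate the compromise, here is a minimal, self-contained sketch of the mapping discussed above. `TTypeId` is a hypothetical stand-in enum (the real one comes from the hive-service-rpc Thrift bindings), and the match is over `simpleString` names rather than Spark `DataType` instances, so the snippet runs without a Spark dependency:

```scala
// Hedged sketch of the timestamp mapping in SchemaHelper.toTTypeId.
// TTypeId below is a local stand-in, not the real Thrift-generated type.
object TimestampMappingSketch {
  sealed trait TTypeId
  case object TIMESTAMP_TYPE extends TTypeId
  case object TIMESTAMPLOCALTZ_TYPE extends TTypeId

  // Matching on simpleString (as the patch does) keeps the engine
  // compiling against Spark versions that predate TimestampNTZType.
  def toTTypeId(simpleString: String): TTypeId = simpleString match {
    // Historical behavior: Spark's single TIMESTAMP maps to Hive's
    // plain TIMESTAMP, even though their semantics differ.
    case "timestamp" => TIMESTAMP_TYPE
    // The new TIMESTAMP_NTZ is also mapped to plain TIMESTAMP for now,
    // which is the compromise under discussion.
    case "timestamp_ntz" => TIMESTAMP_TYPE
    case other =>
      throw new IllegalArgumentException(s"unhandled type: $other")
  }
}
```

Note that collapsing both Spark types onto one Thrift type is exactly why the old mapping's ambiguity "stands out" once both systems support two timestamp flavors.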
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
For additional commands, e-mail: [email protected]