sarutak commented on pull request #34600: URL: https://github.com/apache/spark/pull/34600#issuecomment-968848478
> v1 Hive external catalog doesn't support ANSI intervals

I think the Hive external catalog does support ANSI intervals. In this case the metadata format is not Hive-compatible but Spark-specific:

```
spark-sql> CREATE TABLE tbl2(ym INTERVAL YEAR TO MONTH, dt INTERVAL DAY TO SECOND) using Parquet;
21/11/15 21:05:08 WARN HiveExternalCatalog: Hive incompatible types found: interval year to month, interval day to second. Persisting data source table `default`.`tbl2` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
spark-sql> INSERT INTO tbl2 VALUES(INTERVAL '1-2' YEAR TO MONTH, INTERVAL '1 2:3:4' DAY TO SECOND);
spark-sql> SELECT * FROM tbl2;
1-2	1 02:03:04.000000000
Time taken: 0.381 seconds, Fetched 1 row(s)
```
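
As a side note (not part of the original comment): one way to confirm that the ANSI interval schema is persisted in the Spark SQL specific format is to inspect the table metadata. This is a minimal sketch assuming the same `tbl2` table as above; the exact table property names shown in the output may vary across Spark versions.

```sql
-- Show the full table metadata: the column types report the ANSI interval
-- types, and the table properties section carries the Spark-specific schema
-- that the HiveExternalCatalog warning above refers to.
DESCRIBE TABLE EXTENDED tbl2;

-- Alternatively, show the Spark-side DDL for the table.
SHOW CREATE TABLE tbl2;
```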
