MaxGekk opened a new pull request #34215:
URL: https://github.com/apache/spark/pull/34215
### What changes were proposed in this pull request?
In the PR, I propose a new test (sketched below) that checks:
1. CREATE TABLE with ANSI interval columns
2. INSERT of ANSI interval values into the table
3. SELECT from the table with ANSI interval columns
Since the Hive Metastore/Parquet SerDe doesn't support interval types natively,
Spark falls back to its own format when saving the schema to the Hive external
catalog, and outputs the warning:
```
20:10:52.797 WARN org.apache.spark.sql.hive.test.TestHiveExternalCatalog:
Could not persist `default`.`tbl_with_ansi_intervals` in a Hive compatible way.
Persisting it into Hive metastore in Spark SQL specific format.
org.apache.hadoop.hive.ql.metadata.HiveException:
java.lang.IllegalArgumentException: Error: type expected at the position 0 of
'interval year to month:interval day to second' but 'interval year to month' is
found.
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:869)
```
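For illustration, here is a minimal sketch of what such a round-trip test could look like. It relies on Spark's test helpers `sql`, `withTable`, and `checkAnswer` (available in `QueryTest`/`SQLTestUtils`-based suites such as `HiveParquetSuite`); the column names and interval literals are hypothetical and not necessarily those used in the PR:
```scala
import java.time.{Duration, Period}

import org.apache.spark.sql.Row

// Hypothetical round-trip test: CREATE TABLE, INSERT, then SELECT ANSI intervals.
test("write and read ANSI intervals via a Hive Parquet table") {
  withTable("tbl_with_ansi_intervals") {
    // 1. CREATE TABLE with ANSI interval columns stored as Parquet.
    sql("CREATE TABLE tbl_with_ansi_intervals (ym INTERVAL YEAR TO MONTH, " +
      "dt INTERVAL DAY TO SECOND) USING PARQUET")
    // 2. INSERT ANSI interval values into the table.
    sql("INSERT INTO tbl_with_ansi_intervals SELECT INTERVAL '1-1' YEAR TO MONTH, " +
      "INTERVAL '1 02:03:04.123456' DAY TO SECOND")
    // 3. SELECT from the table and check that the values round-trip.
    checkAnswer(
      sql("SELECT * FROM tbl_with_ansi_intervals"),
      Row(
        Period.ofYears(1).plusMonths(1),
        Duration.ofDays(1).plusHours(2).plusMinutes(3)
          .plusSeconds(4).plusNanos(123456000)))
  }
}
```
Year-month intervals map to `java.time.Period` and day-time intervals to `java.time.Duration` in Spark's external row representation, which is why the expected `Row` is built from those types.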
### Why are the changes needed?
To improve test coverage.
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
By running the new test:
```
$ ./build/sbt -Phive-2.3 "test:testOnly *HiveParquetSuite"
```