MaxGekk opened a new pull request #34201:
URL: https://github.com/apache/spark/pull/34201


   ### What changes were proposed in this pull request?
   In the PR, I propose to add a new test which checks saving/loading of ANSI 
intervals as dataframe columns to/from a table using the Hive external catalog 
and the Parquet datasource.
   
   Since the Hive Metastore/SerDe doesn't support interval types natively, Spark 
falls back to a Spark-specific format for schemas with ANSI intervals, and 
outputs the following warning:
   ```
   23:35:46.289 WARN org.apache.spark.sql.hive.test.TestHiveExternalCatalog: 
Could not persist `default`.`tbl_ansi_intervals` in a Hive compatible way. 
Persisting it into Hive metastore in Spark SQL specific format.
   org.apache.hadoop.hive.ql.metadata.HiveException: 
java.lang.IllegalArgumentException: Error: type expected at the position 0 of 
'interval year to month:interval day to second' but 'interval year to month' is 
found.
        at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:869)
   ``` 
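   The new test could look roughly like the sketch below. This is a minimal illustration, not the exact test from the PR: it assumes the helpers available in Spark's Hive test suites (`withTable`, `sql`, `checkAnswer`, and the implicit `spark` session), and the interval literals are illustrative.
   ```scala
   // Sketch of a round-trip test for ANSI intervals through the Hive
   // external catalog and the Parquet datasource (assumes Spark's Hive
   // test-suite helpers; names and literals are illustrative).
   test("save and load ANSI intervals via Hive external catalog") {
     withTable("tbl_ansi_intervals") {
       // Build a dataframe with one YEAR TO MONTH and one DAY TO SECOND column.
       val df = sql(
         "SELECT INTERVAL '1-1' YEAR TO MONTH AS ym, " +
         "INTERVAL '1 02:03:04' DAY TO SECOND AS ds")
       // Persist it as a Parquet table through the Hive external catalog;
       // this is where the Spark-specific schema fallback (and the warning
       // above) kicks in.
       df.write.format("parquet").saveAsTable("tbl_ansi_intervals")
       // Reading the table back should return the same interval values.
       checkAnswer(spark.table("tbl_ansi_intervals"), df)
     }
   }
   ```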
   
   ### Why are the changes needed?
   To improve test coverage.
   
   ### Does this PR introduce _any_ user-facing change?
   No.
   
   ### How was this patch tested?
   By running the new test:
   ```
   $ ./build/sbt -Phive-2.3 -Phive-thriftserver "test:testOnly 
*HiveParquetSourceSuite"
   ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
