MaxGekk opened a new pull request #26055: [WIP][SPARK-29368][SQL][TEST] Port interval.sql
URL: https://github.com/apache/spark/pull/26055

### What changes were proposed in this pull request?

This PR ports interval.sql from the PostgreSQL regression tests:
https://raw.githubusercontent.com/postgres/postgres/REL_12_STABLE/src/test/regress/sql/interval.sql

The expected results can be found at:
https://github.com/postgres/postgres/blob/REL_12_STABLE/src/test/regress/expected/interval.out

While porting the test cases, the following PostgreSQL-specific features that do not exist in Spark SQL were found:
- [SPARK-29369] Accept strings without the `interval` prefix in casting to intervals
- [SPARK-29370] Interval strings without explicit unit markings
- [SPARK-29371] Support interval field values with fractional parts
- [SPARK-29382] Support the `INTERVAL` type in the Parquet datasource
- [SPARK-29383] Support the optional prefix `@` in interval strings
- [SPARK-29384] Support `ago` in interval strings
- [SPARK-29385] Make `INTERVAL` values comparable
- [SPARK-29386] Copy data between a file and a table
- [SPARK-29387] Support the `*` and `/` operators for intervals
- [SPARK-29388] Construct intervals from the `millenniums`, `centuries` or `decades` units
- [SPARK-29389] Support synonyms for interval units
- [SPARK-29390] Add the justify_days(), justify_hours() and justify_interval() functions
- [SPARK-29391] Default year-month units

### Why are the changes needed?

To improve test coverage; see https://issues.apache.org/jira/browse/SPARK-27763

### Does this PR introduce any user-facing change?

No

### How was this patch tested?

By manually comparing Spark results with PostgreSQL.
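To illustrate the kind of syntax gaps the tickets above cover, here is a short sketch of interval expressions that PostgreSQL accepts, based on PostgreSQL's documented interval syntax rather than on the ported file itself; the ticket mappings in the comments are illustrative:

```sql
-- SPARK-29369: cast a plain string (no 'interval' prefix) to an interval
SELECT CAST('1 day 2 hours' AS interval);

-- SPARK-29383 / SPARK-29384: optional '@' prefix and 'ago' suffix
SELECT INTERVAL '@ 1 minute';
SELECT INTERVAL '2 days ago';   -- 'ago' negates the interval

-- SPARK-29387: multiply and divide an interval by a number
SELECT INTERVAL '1 hour' * 3.5;
SELECT INTERVAL '1 day' / 2;

-- SPARK-29390: normalize interval representations
SELECT justify_hours(INTERVAL '27 hours');   -- 1 day 03:00:00
SELECT justify_days(INTERVAL '35 days');     -- 1 mon 5 days
```

At the time of this PR, Spark SQL rejects these forms, which is why the corresponding test cases were split out into separate JIRA tickets.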
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
