MaxGekk opened a new pull request #32209:
URL: https://github.com/apache/spark/pull/32209


   ### What changes were proposed in this pull request?
   Parse year-month interval literals such as `INTERVAL '1-1' YEAR TO MONTH` 
to values of `YearMonthIntervalType`, and day-time interval literals to 
values of `DayTimeIntervalType`. Currently, Spark SQL supports:
   - DAY TO HOUR
   - DAY TO MINUTE
   - DAY TO SECOND
   - HOUR TO MINUTE
   - HOUR TO SECOND
   - MINUTE TO SECOND
   All such interval literals are converted to `DayTimeIntervalType`, losing 
the info about the `from` and `to` units (see the sketch below).
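   
   For illustration only, a minimal sketch (not taken from the PR itself) of 
literals affected by this change; under the default settings they are parsed 
to the new interval types:
   ```sql
   -- Year-month interval literal: parsed to a YearMonthIntervalType value
   SELECT INTERVAL '1-1' YEAR TO MONTH;
   -- Day-time interval literal: parsed to a DayTimeIntervalType value
   SELECT INTERVAL '10 11:12:13' DAY TO SECOND;
   ```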
   
   Note: the new behavior is guarded by the SQL config 
`spark.sql.legacy.interval.enabled`, which is `false` by default. When the 
config is set to `true`, interval literals are parsed to 
`CalendarIntervalType` values, as before.
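   
   And a hedged sketch of restoring the legacy behavior (the config name 
comes from this PR; the rest is an illustrative assumption):
   ```sql
   -- Switch back to the legacy parsing described above
   SET spark.sql.legacy.interval.enabled=true;
   -- Now parsed to a CalendarIntervalType value, as in previous releases
   SELECT INTERVAL '1-1' YEAR TO MONTH;
   ```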
   
   Closes #32176
   
   ### Why are the changes needed?
   To conform to the ANSI SQL standard, which assumes conversion of interval 
literals to year-month or day-time interval types but not to a mixed interval 
type like Catalyst's `CalendarIntervalType`.
   
   ### Does this PR introduce _any_ user-facing change?
   Yes. By default, year-month and day-time interval literals are now parsed 
to `YearMonthIntervalType` and `DayTimeIntervalType` values instead of 
`CalendarIntervalType`.
   
   ### How was this patch tested?
   1. By running the affected test suites:
   ```
   $ ./build/sbt "test:testOnly *.ExpressionParserSuite"
   $ SPARK_GENERATE_GOLDEN_FILES=1 build/sbt "sql/testOnly *SQLQueryTestSuite 
-- -z interval.sql"
   $ SPARK_GENERATE_GOLDEN_FILES=1 build/sbt "sql/testOnly *SQLQueryTestSuite 
-- -z create_view.sql"
   ```
   2. PostgreSQL tests are executed with `spark.sql.legacy.interval.enabled` 
set to `true` to keep compatibility with PostgreSQL output.

