MaxGekk opened a new pull request #26010: [SPARK-29342][SQL] Make casting of string values to intervals case insensitive
URL: https://github.com/apache/spark/pull/26010

### What changes were proposed in this pull request?

In the PR, I propose to pass the `Pattern.CASE_INSENSITIVE` flag while compiling interval patterns in `CalendarInterval`. This makes casting of string values to intervals case insensitive, tolerating any case of the `interval`, `year(s)`, `month(s)`, `week(s)`, `day(s)`, `hour(s)`, `minute(s)`, `second(s)`, `millisecond(s)` and `microsecond(s)` keywords. I also removed the `fromCaseInsensitiveString` method from `CalendarInterval` and replaced its usages by the regular `fromString()`.

### Why are the changes needed?

There are at least 2 reasons:
- To maintain feature parity with PostgreSQL, which is not sensitive to case:
```sql
# select cast('10 Days' as INTERVAL);
 interval
----------
 10 days
(1 row)
```
- Spark is already tolerant to the case of interval literals, so case insensitivity in casting should be convenient for Spark users:
```sql
spark-sql> SELECT INTERVAL 1 YEAR 1 WEEK;
interval 1 years 1 weeks
```

### Does this PR introduce any user-facing change?

Yes. The current implementation produces `NULL` when `interval`, `year`, ..., `microsecond` are not in lower case.

Before:
```sql
spark-sql> select cast('INTERVAL 10 DAYS' as INTERVAL);
NULL
```
After:
```sql
spark-sql> select cast('INTERVAL 10 DAYS' as INTERVAL);
interval 1 weeks 3 days
```

### How was this patch tested?
- By new tests in `CalendarIntervalSuite.java`
- By a new test in `CastSuite`
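To illustrate the core of the change, here is a minimal sketch of how compiling a regex with `Pattern.CASE_INSENSITIVE` makes unit keywords match in any case. The pattern below is a hypothetical, simplified stand-in; the actual interval patterns in `CalendarInterval` are more elaborate.

```java
import java.util.regex.Pattern;

public class CaseInsensitiveIntervalDemo {
    // Simplified stand-in for CalendarInterval's interval pattern.
    // Passing Pattern.CASE_INSENSITIVE at compile time is the whole fix:
    // "days", "DAYS", and "Days" all match the same alternative.
    private static final Pattern INTERVAL = Pattern.compile(
        "interval\\s+(\\d+)\\s+(day|days)", Pattern.CASE_INSENSITIVE);

    public static boolean matches(String s) {
        return INTERVAL.matcher(s).matches();
    }

    public static void main(String[] args) {
        System.out.println(matches("interval 10 days"));  // matched before the change
        System.out.println(matches("INTERVAL 10 DAYS"));  // matched only with the flag
    }
}
```

Without the flag, the second call would return `false`, which is exactly the case that previously made the cast produce `NULL`.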
