MaxGekk opened a new pull request #25986: [SPARK-29311][SQL] Return seconds with fraction from `date_part()` and `extract` URL: https://github.com/apache/spark/pull/25986

### What changes were proposed in this pull request?

Added a new expression `SecondWithFraction` which produces the `seconds` part of timestamps/dates with a fractional part containing the microseconds. This expression is used only in the `DatePart` expression. As a result, `date_part()` and `extract` return the seconds with microseconds as the fractional part when `field` is `SECOND` (or a synonym).

### Why are the changes needed?

`date_part()` and `extract` were added to maintain feature parity with PostgreSQL, which has different behavior for the `SECOND` value of the `field` parameter. The fix is needed to behave in the same way. Here is PostgreSQL's output:

```sql
# SELECT date_part('SECONDS', timestamp'2019-10-01 00:00:01.000001');
 date_part
-----------
  1.000001
(1 row)
```

### Does this PR introduce any user-facing change?

Yes, the type of `date_part('SECOND', ...)` is changed from `INT` to `DECIMAL(8, 6)`.

Before:
```sql
spark-sql> SELECT date_part('SECONDS', '2019-10-01 00:00:01.000001');
1
```

After:
```sql
spark-sql> SELECT date_part('SECONDS', '2019-10-01 00:00:01.000001');
1.000001
```

### How was this patch tested?

- Added new tests to `DateExpressionSuite` for the `SecondWithFraction` expression
- Regenerated results of `date_part.sql`, `extract.sql` and `timestamp.sql`
- Updated results of `ExtractBenchmark`
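To illustrate the intended semantics (a minimal sketch, not the actual Spark implementation, which is written in Scala against Catalyst expressions), the seconds-with-fraction value can be modeled as the integral seconds plus the microseconds scaled into a six-digit fractional part, matching the `DECIMAL(8, 6)` result type:

```python
from datetime import datetime
from decimal import Decimal

def seconds_with_fraction(ts: datetime) -> Decimal:
    # Combine the integral seconds and the microsecond component into a
    # single decimal value with six fractional digits, mirroring the
    # DECIMAL(8, 6) type that date_part('SECOND', ...) now returns.
    whole = Decimal(ts.second)
    fraction = Decimal(ts.microsecond) / Decimal(1_000_000)
    return (whole + fraction).quantize(Decimal("0.000001"))

ts = datetime.fromisoformat("2019-10-01 00:00:01.000001")
print(seconds_with_fraction(ts))  # 1.000001
```

Using `Decimal` rather than a binary float keeps the microsecond fraction exact, which is the same reason the PR picks a decimal result type instead of `DOUBLE`.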
