ShreyeshArangath opened a new issue, #2155:
URL: https://github.com/apache/auron/issues/2155
**Describe the bug**
Five date-part extraction functions in `NativeConverters.scala` use
`buildExtScalarFunction`, which does not pass the session timezone to the
native Rust implementation.
By contrast, `Hour`, `Minute`, `Second`, and `WeekOfYear` correctly use
`buildTimePartExt`, which passes `sessionLocalTimeZone` for `TimestampType`
inputs.
This inconsistency can cause incorrect results for timestamp inputs near
date boundaries in non-UTC timezones.
Affected functions:
- Year (Spark_Year) — not timezone-aware
- Month (Spark_Month) — not timezone-aware
- DayOfMonth (Spark_Day) — not timezone-aware
- DayOfWeek (Spark_DayOfWeek) — not timezone-aware
- Quarter (Spark_Quarter) — not timezone-aware
**To Reproduce**
1. Set `spark.sql.session.timeZone` to `America/New_York`
2. Create a table `t1` with a timestamp column `ts` containing `2021-01-04 04:30:00 UTC` (equivalent to `2021-01-03 23:30:00` in New York)
3. Run:
```sql
SELECT dayofmonth(ts), dayofweek(ts) FROM t1
```
4. Observe `dayofmonth` return 4 and `dayofweek` return 2 (computed in UTC) instead of 3 and 1, the values for `2021-01-03 23:30:00` in the session timezone
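The underlying discrepancy can be reproduced outside Spark with plain `java.time` (a minimal illustration, not Auron code), using the timestamp from step 2:

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

// The same instant falls on different calendar days (and weekdays)
// depending on the timezone used for extraction.
public class TzBoundaryDemo {
    public static void main(String[] args) {
        Instant ts = Instant.parse("2021-01-04T04:30:00Z"); // timestamp from step 2

        ZonedDateTime inUtc = ZonedDateTime.ofInstant(ts, ZoneId.of("UTC"));
        ZonedDateTime inNy = ZonedDateTime.ofInstant(ts, ZoneId.of("America/New_York"));

        System.out.println("UTC day-of-month:      " + inUtc.getDayOfMonth()); // 4
        System.out.println("New York day-of-month: " + inNy.getDayOfMonth()); // 3
        System.out.println("UTC day-of-week:       " + inUtc.getDayOfWeek()); // MONDAY
        System.out.println("New York day-of-week:  " + inNy.getDayOfWeek()); // SUNDAY
    }
}
```

An implementation that ignores the session timezone effectively always takes the UTC branch above.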
**Expected behavior**
All date-part extraction functions should interpret timestamp inputs in the
session local timezone before extracting the date component, matching Spark's
behavior.
All five functions should use `buildTimePartExt` (or an equivalent), and their
corresponding Rust implementations should accept and handle an optional
timezone argument.
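As a rough sketch of the intended semantics (hypothetical helper name, written in `java.time` rather than Auron's actual Scala or Rust code): the extraction should take an optional session timezone, default to UTC when none is supplied, and convert the instant before pulling the date part. Note that Spark numbers `dayofweek` 1 = Sunday through 7 = Saturday:

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class DatePartSketch {
    // Hypothetical helper: Spark-style dayofweek (1 = Sunday .. 7 = Saturday)
    // for an epoch-microsecond timestamp, honoring an optional session timezone.
    static int sparkDayOfWeek(long epochMicros, String sessionTimeZone) {
        ZoneId zone = ZoneId.of(sessionTimeZone != null ? sessionTimeZone : "UTC");
        Instant ts = Instant.ofEpochSecond(
                Math.floorDiv(epochMicros, 1_000_000L),
                Math.floorMod(epochMicros, 1_000_000L) * 1_000L);
        ZonedDateTime zdt = ZonedDateTime.ofInstant(ts, zone);
        // java.time numbers days 1 = Monday .. 7 = Sunday; shift to Spark's scheme.
        return zdt.getDayOfWeek().getValue() % 7 + 1;
    }

    public static void main(String[] args) {
        long micros = Instant.parse("2021-01-04T04:30:00Z").toEpochMilli() * 1_000L;
        System.out.println(sparkDayOfWeek(micros, null));               // 2 (Monday in UTC)
        System.out.println(sparkDayOfWeek(micros, "America/New_York")); // 1 (Sunday)
    }
}
```

The same pattern (convert to the session zone first, then extract) applies to `year`, `month`, `dayofmonth`, and `quarter`.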
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]