weimingdiit opened a new pull request, #2131: URL: https://github.com/apache/auron/pull/2131
# Which issue does this PR close?

Closes https://github.com/apache/auron/issues/2130

# Rationale for this change

To improve compatibility with Spark SQL date functions, we should implement `weekofyear()` with Spark-compatible semantics.

Expected behavior:
- Function name: `weekofyear(expr)`
- Return type: `INT`

Week semantics:
- A week starts on Monday
- Week 1 is the first week of the year with more than 3 days
- This matches Spark's ISO-style week numbering behavior

Examples:
- `weekofyear('2009-07-30')` -> `31`
- `weekofyear('2016-01-01')` -> `53`
- `weekofyear('2017-01-01')` -> `52`

Supported inputs: `DATE`, `TIMESTAMP`, and compatible string/date inputs, consistent with the existing date extraction functions.

Additional expectations:
- Null-safe: returns `NULL` if the input is `NULL`
- Array and scalar inputs behave consistently with the existing native date extraction functions
- Cross-year boundary behavior matches Spark semantics exactly

# What changes are included in this PR?

This PR adds native support for the `weekofyear()` function with Spark-compatible semantics. The following changes are included:

- Added a native implementation of `spark_weekofyear()` in the expression layer
- Added `WeekOfYear` expression support in `NativeConverters` for proper Spark -> native translation
- Registered `Spark_WeekOfYear` in the native function dispatch
- Added unit tests to verify correctness for:
  - normal date inputs
  - cross-year boundary cases
  - Spark-compatible ISO week numbering semantics
  - null input handling

# Are there any user-facing changes?

No.

# How was this patch tested?

- Added and ran targeted Rust unit tests for `spark_weekofyear()`
- Verified expected results for representative Spark-compatible cases such as:
  - `weekofyear('2009-07-30') = 31`
  - `weekofyear('2016-01-01') = 53`
  - `weekofyear('2017-01-01') = 52`

--
This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
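For readers unfamiliar with the week rule described above (weeks start on Monday; week 1 is the first week with more than 3 days in the new year, i.e. the ISO-8601 week containing January 4), the semantics can be sketched in dependency-free Rust. This is an illustrative sketch only, not the PR's actual `spark_weekofyear()` code; the function names `day_of_week`, `day_of_year`, `has_53_weeks`, and `week_of_year` are hypothetical helpers chosen for this example.

```rust
// Illustrative sketch of ISO-8601 week numbering (Spark's `weekofyear` rule).
// Weekday via Sakamoto's algorithm; day-of-year via a cumulative-days table.

fn is_leap(y: i32) -> bool {
    (y % 4 == 0 && y % 100 != 0) || y % 400 == 0
}

/// Day of week, 0 = Sunday .. 6 = Saturday (Sakamoto's algorithm).
fn day_of_week(y: i32, m: u32, d: u32) -> u32 {
    const T: [i32; 12] = [0, 3, 2, 5, 0, 3, 5, 1, 4, 6, 2, 4];
    let y = if m < 3 { y - 1 } else { y };
    ((y + y / 4 - y / 100 + y / 400 + T[(m - 1) as usize] + d as i32) % 7) as u32
}

fn day_of_year(y: i32, m: u32, d: u32) -> i32 {
    const CUM: [i32; 12] = [0, 31, 59, 90, 120, 151, 181, 212, 243, 273, 304, 334];
    CUM[(m - 1) as usize] + d as i32 + if m > 2 && is_leap(y) { 1 } else { 0 }
}

/// A year has 53 ISO weeks iff Jan 1 falls on a Thursday,
/// or on a Wednesday in a leap year.
fn has_53_weeks(y: i32) -> bool {
    let jan1 = day_of_week(y, 1, 1);
    jan1 == 4 || (is_leap(y) && jan1 == 3)
}

/// ISO week number: weeks start on Monday; week 1 is the first week
/// with more than 3 days in the new year.
fn week_of_year(y: i32, m: u32, d: u32) -> u32 {
    let dow = day_of_week(y, m, d);
    let iso_dow = if dow == 0 { 7 } else { dow } as i32; // Monday = 1 .. Sunday = 7
    let week = (day_of_year(y, m, d) - iso_dow + 10) / 7;
    if week < 1 {
        // The date belongs to the last week of the previous year.
        if has_53_weeks(y - 1) { 53 } else { 52 }
    } else if week == 53 && !has_53_weeks(y) {
        1 // The date belongs to week 1 of the next year.
    } else {
        week as u32
    }
}

fn main() {
    // The examples from the PR description:
    println!("{}", week_of_year(2009, 7, 30)); // 31
    println!("{}", week_of_year(2016, 1, 1)); // 53
    println!("{}", week_of_year(2017, 1, 1)); // 52
}
```

The cross-year boundary cases in the PR's examples fall out of the two branches at the end: `2016-01-01` lands in week 53 of 2015 (a 53-week year), while `2017-01-01` lands in week 52 of 2016.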
