andygrove opened a new issue, #3086:
URL: https://github.com/apache/datafusion-comet/issues/3086
## What is the problem the feature request solves?
> **Note:** This issue was generated with AI assistance. The specification
details have been extracted from Spark documentation and may need verification.
Comet does not currently support Spark's `DateAddInterval` expression
(produced by `date + interval` arithmetic), causing queries that use it to fall
back to Spark's JVM execution instead of running natively on DataFusion.
The `DateAddInterval` expression adds a calendar interval to a date value
and returns the resulting date. It honors ANSI mode for strict error handling
and chooses an optimized evaluation path based on whether the interval carries
a microsecond component.
Supporting this expression would allow more Spark workloads to benefit from
Comet's native acceleration.
## Describe the potential solution
### Spark Specification
**Syntax:**
```sql
date_column + interval_expression
```
**Arguments:**
| Argument | Type | Description |
|----------|------|-------------|
| start | DateType | The starting date to which the interval is added |
| interval | CalendarIntervalType | The calendar interval containing months, days, and microseconds |
| timeZoneId | Option[String] | Optional timezone identifier for timestamp conversions (defaults to None) |
| ansiEnabled | Boolean | Whether ANSI mode is enabled (defaults to the SQLConf setting) |
**Return Type:** `DateType` - Returns a date value representing the sum of
the input date and interval.
**Supported Data Types:**
- **Input**: `DateType` for the start date, `CalendarIntervalType` for the
interval
- **Output**: `DateType`
**Edge Cases:**
- **Null handling**: Null-intolerant - returns null if either input is null
- **ANSI mode**: Throws `IllegalArgumentException` for invalid date
arithmetic operations
- **Microsecond precision**: Automatically handles conversion between date
and timestamp representations based on interval precision
- **Timezone sensitivity**: Uses specified timezone for intermediate
timestamp calculations when microseconds are present
- **Overflow**: Delegates overflow handling to underlying `DateTimeUtils`
methods
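The microsecond-precision and timezone edge cases above can be illustrated with a small sketch. This is not Comet's or Spark's actual code: the function name and the timezone-free "timestamp" round trip are simplifications for illustration (Spark routes the slow path through `DateTimeUtils` with the session timezone, and months are handled separately by calendar arithmetic):

```rust
const MICROS_PER_DAY: i64 = 86_400_000_000;

/// Illustrative sketch: given a date as days since the Unix epoch and an
/// interval's day and microsecond components (months handled elsewhere),
/// pick an evaluation path the way the edge-case notes describe.
fn add_days_and_micros(epoch_days: i32, days: i32, micros: i64) -> i32 {
    if micros == 0 {
        // Fast path: the interval has no sub-day component, so stay in the
        // date domain and add whole days directly.
        epoch_days + days
    } else {
        // Slow path: widen to microseconds-since-epoch (a timestamp-like
        // representation), add, then floor back down to a date. Real Spark
        // applies the session timezone during this round trip.
        let ts = (epoch_days as i64 + days as i64) * MICROS_PER_DAY + micros;
        ts.div_euclid(MICROS_PER_DAY) as i32
    }
}
```

`div_euclid` floors toward negative infinity, which keeps pre-1970 dates correct when the microsecond remainder is negative.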
**Examples:**
```sql
-- Add 1 month to a date
SELECT DATE '2023-01-15' + INTERVAL '1' MONTH;
-- Add complex interval to date
SELECT DATE '2023-01-15' + INTERVAL '2 months 10 days';
```
```scala
// DataFrame API usage
import org.apache.spark.sql.functions._
df.select(col("date_column") + expr("INTERVAL '1' MONTH"))
// Using a multi-unit interval expression
df.select(col("date_column") + expr("INTERVAL '1 month 5 days'"))
```
### Implementation Approach
See the [Comet guide on adding new
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
for detailed instructions.
1. **Scala Serde**: Add expression handler in
`spark/src/main/scala/org/apache/comet/serde/`
2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if
needed
4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has
built-in support first)
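For step 4, the month component needs calendar-aware arithmetic. A minimal sketch of that piece is below; it is illustrative only, not Comet's actual code, though it mirrors the day-of-month clamping behavior of `java.time.LocalDate.plusMonths`, which Spark's `DateTimeUtils.dateAddMonths` relies on:

```rust
/// Gregorian leap-year test.
fn is_leap(y: i32) -> bool {
    (y % 4 == 0 && y % 100 != 0) || y % 400 == 0
}

/// Number of days in month `m` (1-12) of year `y`.
fn days_in_month(y: i32, m: u32) -> u32 {
    match m {
        1 | 3 | 5 | 7 | 8 | 10 | 12 => 31,
        4 | 6 | 9 | 11 => 30,
        2 => if is_leap(y) { 29 } else { 28 },
        _ => unreachable!("month out of range"),
    }
}

/// Illustrative sketch: add `months` to a (year, month, day) date, clamping
/// the day to the end of the resulting month (e.g. Jan 31 + 1 month = Feb 28).
fn add_months(y: i32, m: u32, d: u32, months: i32) -> (i32, u32, u32) {
    // Work in a flat month index so negative offsets and year rollover
    // fall out of euclidean division.
    let total = y * 12 + m as i32 - 1 + months;
    let ny = total.div_euclid(12);
    let nm = total.rem_euclid(12) as u32 + 1;
    (ny, nm, d.min(days_in_month(ny, nm)))
}
```

A real implementation would also check whether DataFusion's existing date/interval kernels already cover this before writing new Rust.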
## Additional context
**Difficulty:** Medium
**Spark Expression Class:**
`org.apache.spark.sql.catalyst.expressions.DateAddInterval`
**Related:**
- `DatetimeSub` - Subtracts intervals from dates
- `TimestampAddInterval` - Adds intervals to timestamps
- `CalendarInterval` - Represents calendar intervals with months, days, and
microseconds
---
*This issue was auto-generated from Spark reference documentation.*