andygrove opened a new issue, #3096:
URL: https://github.com/apache/datafusion-comet/issues/3096
## What is the problem the feature request solves?
> **Note:** This issue was generated with AI assistance. The specification details have been extracted from Spark documentation and may need verification.
Comet does not currently support Spark's `DivideDTInterval` expression, so queries that divide a day-time interval by a numeric value fall back to Spark's JVM execution instead of running natively on DataFusion.

The `DivideDTInterval` expression divides a day-time interval by a numeric value and returns a new day-time interval. The division uses the `HALF_UP` rounding mode and checks for overflow and division by zero.
Supporting this expression would allow more Spark workloads to benefit from
Comet's native acceleration.
## Describe the potential solution
### Spark Specification
**Syntax:**
```sql
day_time_interval / numeric_value
```
**Arguments:**

| Argument | Type | Description |
|----------|------|-------------|
| `interval` | `DayTimeIntervalType` | The day-time interval to be divided |
| `num` | `NumericType` | The numeric divisor (integral, decimal, or fractional types) |
**Return Type:** `DayTimeIntervalType()` - Returns a day-time interval
representing the result of the division operation.
**Supported Data Types:**
- **Left operand (interval)**: `DayTimeIntervalType` only
- **Right operand (num)**: All numeric types including:
- `IntegralType` (Byte, Short, Int, Long)
- `DecimalType`
- `FractionalType` (Float, Double)
**Edge Cases:**
- **Null handling**: The expression is null-intolerant (`nullIntolerant = true`); it returns null if either operand is null
- **Divide by zero**: Throws a divide-by-zero error (raised via `QueryExecutionErrors`) when the divisor is zero
- **Overflow behavior**:
  - Checks for the `Long.MinValue / -1` overflow condition
  - Throws `QueryExecutionErrors.overflowInIntegralDivideError` on overflow
  - Uses `longValueExact()` for decimal divisors to detect overflow
- **Precision**: All division uses the `HALF_UP` rounding mode for consistent behavior (see the sketch after this list)
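
To make the rounding and overflow semantics above concrete, here is a minimal Scala sketch of the arithmetic on the interval's underlying microsecond value. This is illustrative only, not Spark's actual implementation (which dispatches on the divisor type inside `DivideDTInterval`); the helper name `divideIntervalMicros` is hypothetical.

```scala
import java.math.{BigDecimal => JBigDecimal, RoundingMode}

// Hypothetical helper mirroring the documented edge cases; a day-time
// interval is stored as a Long count of microseconds.
def divideIntervalMicros(micros: Long, divisor: JBigDecimal): Long = {
  if (divisor.signum() == 0) {
    throw new ArithmeticException("interval divided by zero")
  }
  // Divide with HALF_UP rounding to zero decimal places, then use
  // longValueExact(), which throws ArithmeticException if the result
  // overflows a Long (this covers the Long.MinValue / -1 case).
  JBigDecimal.valueOf(micros)
    .divide(divisor, 0, RoundingMode.HALF_UP)
    .longValueExact()
}

// INTERVAL '2 12:30:45.123' DAY TO SECOND is 217,845,123,000 microseconds;
// dividing by 2 yields 108,922,561,500 µs = INTERVAL '1 06:15:22.5615',
// matching the SQL example below.
val halved = divideIntervalMicros(217845123000L, JBigDecimal.valueOf(2L))
```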
**Examples:**
```sql
-- Divide a day-time interval by an integer
SELECT INTERVAL '2 12:30:45.123' DAY TO SECOND / 2;
-- Result: INTERVAL '1 06:15:22.561500' DAY TO SECOND
-- Divide by decimal value
SELECT INTERVAL '5 00:00:00' DAY TO SECOND / 2.5;
-- Result: INTERVAL '2 00:00:00' DAY TO SECOND
```
```scala
// Example DataFrame API usage; df is assumed to have a
// DayTimeIntervalType column named "day_time_interval"
import org.apache.spark.sql.functions._

// Divide the interval column by a numeric literal
df.select(col("day_time_interval") / lit(3))

// Divide the interval column by another numeric column
df.select(col("day_time_interval") / col("divisor"))
```
### Implementation Approach
See the [Comet guide on adding new
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
for detailed instructions.
1. **Scala Serde**: Add an expression handler in `spark/src/main/scala/org/apache/comet/serde/` (a hedged sketch follows this list)
2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if
needed
4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has
built-in support first)
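
As a starting point for step 1, the sketch below shows the rough shape of a serde handler. Everything here is an assumption to verify against the current Comet sources: the `CometExpressionSerde` trait, the `exprToProtoInternal` helper, and the generated `ExprOuterClass.Expr` type are modeled on existing handlers around `QueryPlanSerde.scala` and may differ in the version you work against.

```scala
import org.apache.spark.sql.catalyst.expressions.{Attribute, DivideDTInterval}
import org.apache.comet.serde.ExprOuterClass.Expr
import org.apache.comet.serde.QueryPlanSerde.exprToProtoInternal

// Hypothetical handler; trait name and method signature are assumptions.
object CometDivideDTInterval extends CometExpressionSerde[DivideDTInterval] {
  override def convert(
      expr: DivideDTInterval,
      inputs: Seq[Attribute],
      binding: Boolean): Option[Expr] = {
    // Serialize both children; if either fails, fall back to Spark.
    for {
      intervalProto <- exprToProtoInternal(expr.interval, inputs, binding)
      numProto <- exprToProtoInternal(expr.num, inputs, binding)
    } yield {
      // Build the protobuf Expr node defined in step 3 from the two
      // child expressions (left as a placeholder in this sketch).
      ???
    }
  }
}
```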
## Additional context
**Difficulty:** Medium
**Spark Expression Class:**
`org.apache.spark.sql.catalyst.expressions.DivideDTInterval`
**Related:**
- `DivideYMInterval` - Division operation for year-month intervals
- `MultiplyDTInterval` - Multiplication of day-time intervals
- `IntervalDivide` - Base trait for interval division operations
- `BinaryExpression` - Parent class for binary operations
---
*This issue was auto-generated from Spark reference documentation.*