andygrove opened a new issue, #3101:
URL: https://github.com/apache/datafusion-comet/issues/3101
## What is the problem the feature request solves?
> **Note:** This issue was generated with AI assistance. The specification
> details have been extracted from Spark documentation and may need verification.
Comet does not currently support Spark's `MultiplyDTInterval` expression (the
`*` operator applied to a day-time interval), so queries that use it fall back
to Spark's JVM execution instead of running natively on DataFusion.
The `MultiplyDTInterval` expression multiplies a day-time interval by a
numeric value, scaling the duration proportionally. This is a binary expression
that performs null-safe multiplication between interval and numeric types with
proper overflow handling and rounding.
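As a minimal sketch of those semantics (assuming, as in Spark's internal
representation, that a day-time interval is physically a `Long` count of
microseconds; this is not Spark's actual code):
```scala
// Minimal sketch, not Spark's implementation: null-safe multiplication of a
// day-time interval (a Long count of microseconds) by an integral multiplier.
def multiplyDTInterval(intervalMicros: java.lang.Long, num: java.lang.Long): java.lang.Long =
  if (intervalMicros == null || num == null) null // null in, null out
  else Long.box(Math.multiplyExact(intervalMicros, num)) // ArithmeticException on overflow
```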
Supporting this expression would allow more Spark workloads to benefit from
Comet's native acceleration.
## Describe the potential solution
### Spark Specification
**Syntax:**
```sql
-- SQL syntax
day_time_interval * numeric_value
-- Example with an interval literal
INTERVAL '1 2:3:4.567' DAY TO SECOND * 2.5
```
**Arguments:**
| Argument | Type | Description |
|----------|------|-------------|
| `interval` | DayTimeIntervalType | The day-time interval to be multiplied (left operand) |
| `num` | NumericType | The numeric multiplier value (right operand) |
**Return Type:** `DayTimeIntervalType` - a day-time interval representing the
scaled duration.
**Supported Data Types:**
- **Left operand**: `DayTimeIntervalType` only
- **Right operand**: All numeric types including:
- Integral types (Byte, Short, Int, Long)
- Fractional types (Float, Double)
  - Decimal types (any supported precision and scale)
**Edge Cases:**
- **Null handling**: Null-intolerant expression - returns null if either
operand is null
- **Overflow behavior**:
- Integral multiplication throws `ArithmeticException` on overflow via
`Math.multiplyExact()`
- Decimal operations may throw `ArithmeticException` on scale conversion
- Fractional operations may produce `Infinity` or lose precision
- **Rounding**: Non-integral results are rounded to the nearest microsecond
using `HALF_UP` mode (see the sketch after this list)
- **Zero multiplication**: Results in zero-duration interval
- **Negative multiplication**: Produces negative intervals (reverse
direction)
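A rough sketch of that fractional path under the same microsecond
representation (`scala.math.BigDecimal` here is a stand-in for Spark's internal
rounding helpers, not its actual code):
```scala
// Sketch only: scale the microsecond count, then round to the nearest whole
// microsecond with HALF_UP (e.g. 2.5 microseconds rounds to 3).
def multiplyByDouble(intervalMicros: Long, num: Double): Long =
  BigDecimal(intervalMicros * num)
    .setScale(0, BigDecimal.RoundingMode.HALF_UP)
    .longValue
```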
**Examples:**
```sql
-- Basic interval multiplication
SELECT INTERVAL '2 10:30:45' DAY TO SECOND * 3;
-- Result: INTERVAL '7 07:32:15' DAY TO SECOND
-- Fractional multiplication with rounding
SELECT INTERVAL '1 12:00:00' DAY TO SECOND * 1.5;
-- Result: INTERVAL '2 06:00:00' DAY TO SECOND
-- Negative multiplication
SELECT INTERVAL '5 08:30:00' DAY TO SECOND * -0.5;
-- Result: INTERVAL '-2 16:15:00' DAY TO SECOND
```
```scala
// DataFrame API usage
import org.apache.spark.sql.functions._
df.select(col("day_interval") * lit(2.5))
// With explicit casting if needed
df.select(col("day_interval") * col("multiplier").cast("double"))
```
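For reference, the first SQL example can be checked by hand with the same
microsecond arithmetic (a standalone sketch, not Spark code):
```scala
// INTERVAL '2 10:30:45' DAY TO SECOND * 3 => INTERVAL '7 07:32:15' DAY TO SECOND
val microsPerSecond = 1000000L
val microsPerDay    = 86400L * microsPerSecond

val interval = (2L * 86400 + 10 * 3600 + 30 * 60 + 45) * microsPerSecond
val product  = Math.multiplyExact(interval, 3L)

val days = product / microsPerDay                      // 7
val secs = (product % microsPerDay) / microsPerSecond  // 27135 s = 07:32:15
```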
### Implementation Approach
See the [Comet guide on adding new
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
for detailed instructions.
1. **Scala Serde**: Add an expression handler in
`spark/src/main/scala/org/apache/comet/serde/` (a sketch follows this list)
2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if
needed
4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has
built-in support first)
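For step 1, a self-contained sketch of what such a handler could look like
follows. The `Proto` type, `exprToProto` helper, and operator name are
hypothetical stand-ins for Comet's real serde utilities; only
`MultiplyDTInterval` is Spark's actual expression class.
```scala
import org.apache.spark.sql.catalyst.expressions.{Expression, MultiplyDTInterval}

// Hypothetical stand-ins for Comet's protobuf builder and recursive converter.
case class Proto(op: String, children: Seq[Proto])
def exprToProto(e: Expression): Option[Proto] = None // placeholder

// Match MultiplyDTInterval and convert both children; returning None lets
// Comet fall back to Spark's JVM execution for unsupported inputs.
def convertMultiplyDTInterval(expr: Expression): Option[Proto] = expr match {
  case m: MultiplyDTInterval =>
    for {
      left  <- exprToProto(m.interval)
      right <- exprToProto(m.num)
    } yield Proto("multiply_dt_interval", Seq(left, right))
  case _ => None
}
```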
## Additional context
**Difficulty:** Medium
**Spark Expression Class:**
`org.apache.spark.sql.catalyst.expressions.MultiplyDTInterval`
**Related:**
- `MultiplyYMInterval` - Year-month interval multiplication
- `DivideYMInterval` - Year-month interval division
- `DivideDTInterval` - Day-time interval division
- Interval arithmetic expressions in Apache Spark SQL
---
*This issue was auto-generated from Spark reference documentation.*