andygrove opened a new issue, #3098:
URL: https://github.com/apache/datafusion-comet/issues/3098
## What is the problem the feature request solves?
> **Note:** This issue was generated with AI assistance. The specification
> details have been extracted from Spark documentation and may need verification.
Comet does not currently support the Spark `make_dt_interval` function,
causing queries using this function to fall back to Spark's JVM execution
instead of running natively on DataFusion.
The `MakeDTInterval` expression creates a day-time interval value from
separate day, hour, minute, and second components. This expression is used to
construct `DayTimeIntervalType` values programmatically by combining individual
time unit values into a single interval representation.
Supporting this expression would allow more Spark workloads to benefit from
Comet's native acceleration.
## Describe the potential solution
### Spark Specification
**Syntax:**
```sql
make_dt_interval(days, hours, minutes, seconds)
make_dt_interval(days, hours, minutes)
make_dt_interval(days, hours)
make_dt_interval(days)
make_dt_interval()
```
**Arguments:**
| Argument | Type | Description |
|----------|------|-------------|
| `days` | IntegerType | Number of days in the interval (optional, defaults to 0) |
| `hours` | IntegerType | Number of hours in the interval (optional, defaults to 0) |
| `minutes` | IntegerType | Number of minutes in the interval (optional, defaults to 0) |
| `seconds` | DecimalType(MAX_LONG_DIGITS, 6) | Number of seconds including microsecond precision (optional, defaults to 0) |
**Return Type:** Returns a `DayTimeIntervalType()` representing the constructed interval.

**Supported Data Types:**

- **days**: Integer values
- **hours**: Integer values
- **minutes**: Integer values
- **seconds**: Decimal values with up to 6 decimal places for microsecond precision
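
The scale-6 decimal for seconds lines up directly with microseconds (10^6 microseconds per second). As a hedged sketch of that relationship, and not Comet's actual code: a decimal seconds value stored as an unscaled integer can be rescaled to a whole microsecond count, with anything beyond scale 6 rejected as sub-microsecond.

```rust
// Sketch (assumption, not Comet code): rescale a decimal seconds value,
// given as (unscaled integer, scale), to whole microseconds. A scale of 6
// means the unscaled value already *is* microseconds.
fn decimal_seconds_to_micros(unscaled: i64, scale: u32) -> Option<i64> {
    if scale <= 6 {
        // Multiply up to 6 fractional digits; checked_* guard against overflow.
        unscaled.checked_mul(10_i64.checked_pow(6 - scale)?)
    } else {
        None // would drop sub-microsecond digits; the spec caps scale at 6
    }
}
```

For example, `45.5` seconds is `(455, 1)` in unscaled form and rescales to 45,500,000 microseconds.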
**Edge Cases:**

- **Null handling**: Expression is null-intolerant (`nullIntolerant = true`), meaning if any input is null, the result is null
- **Default values**: Missing parameters default to 0 (literal values)
- **Precision handling**: Seconds parameter uses DecimalType with 6 decimal places to preserve microsecond precision
- **Overflow behavior**: Delegates to `IntervalUtils.makeDayTimeInterval()` for overflow validation and error handling
- **Error context**: Includes query context information for meaningful error messages when interval construction fails
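
The combination and overflow behavior can be sketched as follows. This is an assumption about the arithmetic a native implementation would need, modeled on what `IntervalUtils.makeDayTimeInterval()` must do (fold all components into one microsecond count and fail on `Long` overflow); it is not the Spark or Comet implementation.

```rust
// Sketch (assumption): combine day/hour/minute/second components into the
// single i64 microsecond count backing a day-time interval value.
// checked_mul/checked_add return None on i64 overflow, standing in for
// Spark's overflow error (which also attaches query context).
const MICROS_PER_SECOND: i64 = 1_000_000;
const MICROS_PER_MINUTE: i64 = 60 * MICROS_PER_SECOND;
const MICROS_PER_HOUR: i64 = 60 * MICROS_PER_MINUTE;
const MICROS_PER_DAY: i64 = 24 * MICROS_PER_HOUR;

fn make_dt_interval_micros(
    days: i32,
    hours: i32,
    minutes: i32,
    seconds_micros: i64, // seconds argument already converted to microseconds
) -> Option<i64> {
    (days as i64)
        .checked_mul(MICROS_PER_DAY)?
        .checked_add((hours as i64).checked_mul(MICROS_PER_HOUR)?)?
        .checked_add((minutes as i64).checked_mul(MICROS_PER_MINUTE)?)?
        .checked_add(seconds_micros)
}
```

Under this sketch, `make_dt_interval(5, 3, 30, 45.5)` from the examples below yields 444,645,500,000 microseconds.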
**Examples:**
```sql
-- Create a 5-day, 3-hour, 30-minute, 45.5-second interval
SELECT make_dt_interval(5, 3, 30, 45.5);
-- Create a 2-day interval
SELECT make_dt_interval(2);
-- Create a 1-day, 12-hour interval
SELECT make_dt_interval(1, 12);
-- Create an empty interval
SELECT make_dt_interval();
```
```scala
// DataFrame API usage
import org.apache.spark.sql.functions._
// Create interval from literal values
df.select(expr("make_dt_interval(5, 3, 30, 45.5)"))
// Create interval from column values
df.select(expr("make_dt_interval(day_col, hour_col, min_col, sec_col)"))
```
### Implementation Approach
See the [Comet guide on adding new
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
for detailed instructions.
1. **Scala Serde**: Add expression handler in `spark/src/main/scala/org/apache/comet/serde/`
2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if needed
4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has built-in support first)
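
Because `make_dt_interval` accepts zero to four arguments with literal-zero defaults, the serde step will likely need to pad missing arguments before serialization. The function below is a hypothetical illustration of that padding only; the name and string-based argument representation are not Comet's actual API.

```rust
// Hypothetical illustration (not Comet's API): pad the 0-4 user-supplied
// argument expressions out to the full (days, hours, minutes, seconds)
// shape with literal zero defaults, matching the Spark spec above.
fn pad_make_dt_interval_args(args: Vec<String>) -> Vec<String> {
    let defaults = ["0", "0", "0", "0.0"]; // days, hours, minutes, seconds
    let mut out = args;
    while out.len() < 4 {
        out.push(defaults[out.len()].to_string());
    }
    out
}
```

For example, `make_dt_interval(2)` would be serialized as if written `make_dt_interval(2, 0, 0, 0.0)`.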
## Additional context
**Difficulty:** Large
**Spark Expression Class:**
`org.apache.spark.sql.catalyst.expressions.MakeDTInterval`
**Related:**
- `MakeYMInterval` - Creates year-month intervals
- `IntervalUtils` - Utility class for interval operations
- `DayTimeIntervalType` - The data type returned by this expression
- `Extract` - Extracts components from interval values
---
*This issue was auto-generated from Spark reference documentation.*