andygrove opened a new issue, #3134:
URL: https://github.com/apache/datafusion-comet/issues/3134

   ## What is the problem the feature request solves?
   
   > **Note:** This issue was generated with AI assistance. The specification 
details have been extracted from Spark documentation and may need verification.
   
   Comet does not currently support Spark's `DatetimeSub` expression (produced by `datetime - interval` arithmetic), causing queries that use it to fall back to Spark's JVM execution instead of running natively on DataFusion.
   
   DatetimeSub is a runtime-replaceable expression that subtracts an interval 
from a timestamp or date value. It serves primarily as a SQL presentation layer 
for datetime subtraction operations, providing a clean string representation 
while delegating actual computation to its replacement expression.
   
   Supporting this expression would allow more Spark workloads to benefit from 
Comet's native acceleration.
   
   ## Describe the potential solution
   
   ### Spark Specification
   
   **Syntax:**
   ```sql
   datetime_column - INTERVAL value unit
   timestamp_column - INTERVAL '1' DAY
   ```
   
   **Arguments:**
   | Argument | Type | Description |
   |----------|------|-------------|
   | start | Expression | The timestamp or date value from which to subtract |
   | interval | Expression | The interval value to subtract from the start datetime |
   | replacement | Expression | The underlying expression that performs the actual computation |
   
   **Return Type:** Determined by the replacement expression; typically:
   
   - `TimestampType` when subtracting from timestamp values
   - `DateType` when subtracting from date values
   
   **Supported Data Types:**
   - **start**: `TimestampType`, `DateType`
   - **interval**: `CalendarIntervalType`, interval literals
   - Input types are validated by the replacement expression during analysis
   
   **Edge Cases:**
   - **Null handling**: Behavior depends on the replacement expression, 
typically follows SQL null semantics
   - **Invalid intervals**: Runtime errors may occur for malformed interval 
expressions
   - **Overflow scenarios**: Large interval subtractions may cause timestamp 
overflow
   - **Type mismatches**: Analysis phase will validate compatible types between 
start and interval
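   The type-preservation and overflow cases can be illustrated with the Python standard library (an analogy only, not Spark behavior):

   ```python
   # Plain-Python analogy (stdlib datetime, not Spark) for two edge cases.
   from datetime import date, datetime, timedelta

   # Subtracting an interval preserves the input's type -- mirroring the
   # TimestampType vs. DateType results described above.
   print(type(datetime(2024, 6, 1) - timedelta(days=1)))  # <class 'datetime.datetime'>
   print(type(date(2024, 6, 1) - timedelta(days=1)))      # <class 'datetime.date'>

   # Overflow: subtracting past the representable range raises an error,
   # analogous to the timestamp-overflow scenario for large intervals.
   try:
       datetime.min - timedelta(days=1)
   except OverflowError as e:
       print("overflow:", e)
   ```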
   
   **Examples:**
   ```sql
   -- Subtract 1 day from current timestamp
   SELECT current_timestamp() - INTERVAL '1' DAY;
   
   -- Subtract multiple units
   SELECT date_col - INTERVAL '1 year 2 months 3 days' FROM my_table;
   ```
   
   ```scala
   // DataFrame API usage
   import org.apache.spark.sql.functions._
   
   df.select(col("timestamp_col") - expr("INTERVAL '1' HOUR"))
   df.withColumn("yesterday", col("date_col") - expr("INTERVAL '1' DAY"))
   ```
   
   ### Implementation Approach
   
   See the [Comet guide on adding new 
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
 for detailed instructions.
   
   1. **Scala Serde**: Add expression handler in 
`spark/src/main/scala/org/apache/comet/serde/`
   2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
   3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if 
needed
   4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has 
built-in support first)
   
   
   ## Additional context
   
   **Difficulty:** Medium
   **Spark Expression Class:** 
`org.apache.spark.sql.catalyst.expressions.DatetimeSub`
   
   **Related:**
   - `DatetimeAdd` - Addition counterpart for datetime arithmetic
   - `TimeAdd` - Time-specific addition operations
   - `DateAdd` - Date-specific addition operations
   - `RuntimeReplaceable` - Base trait for expressions that are rewritten to their replacement expression during query analysis/optimization
   
   ---
   *This issue was auto-generated from Spark reference documentation.*
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

