andygrove opened a new issue, #3119:
URL: https://github.com/apache/datafusion-comet/issues/3119

   ## What is the problem the feature request solves?
   
   > **Note:** This issue was generated with AI assistance. The specification 
details have been extracted from Spark documentation and may need verification.
   
   Comet does not currently support the Spark `make_time` function, so queries that use it fall back to Spark's JVM execution instead of running natively on DataFusion.
   
   The `MakeTime` expression creates a time value from separate hour, minute, 
and second components. It constructs a time of day by combining integer hour 
and minute values with a decimal seconds value that can include microsecond 
precision.
   
   Supporting this expression would allow more Spark workloads to benefit from 
Comet's native acceleration.
   
   ## Describe the potential solution
   
   ### Spark Specification
   
   **Syntax:**
   ```sql
   make_time(hours, minutes, seconds)
   ```
   
   **Arguments:**
   | Argument | Type | Description |
   |----------|------|-------------|
   | hours | IntegerType | The hour component (0-23) |
   | minutes | IntegerType | The minute component (0-59) |
   | seconds (`secsAndMicros`) | DecimalType(16, 6) | The seconds component with microsecond precision (0-59.999999) |
   
   **Return Type:** Returns `TimeType` with microsecond precision 
(`TimeType.MICROS_PRECISION`).
   
   **Supported Data Types:**
   - Hours: Integer types that can be cast to `IntegerType`
   
   - Minutes: Integer types that can be cast to `IntegerType`
   
   - Seconds: Numeric types that can be cast to `DecimalType(16, 6)`
   
   **Edge Cases:**
   - **Null handling**: If any input parameter is NULL, the result is NULL
   
   - **Invalid time components**: Behavior depends on the underlying 
`DateTimeUtils.makeTime` implementation for out-of-range values
   
   - **Precision handling**: The `DecimalType(16, 6)` ensures microsecond 
precision is maintained for the seconds component
   
   - **Type casting**: Integer seconds values are safely cast to decimal 
without precision loss due to the wide decimal type used
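   The null-handling and range rules above can be sketched in Rust. This is an illustrative helper, not Comet's actual API; the out-of-range behavior (error vs. NULL) still needs verification against Spark's `DateTimeUtils.makeTime`:
   
   ```rust
   /// Hypothetical sketch of make_time semantics: combine hour, minute, and the
   /// seconds component (expressed as microseconds, i.e. the unscaled value of a
   /// Decimal(16, 6)) into microseconds since midnight.
   fn make_time(hour: Option<i32>, minute: Option<i32>, micros_of_second: Option<i64>) -> Option<i64> {
       // Null handling: any NULL input yields NULL.
       let (h, m, us) = (hour?, minute?, micros_of_second?);
       // Out-of-range components are rejected here for illustration; Spark's
       // actual behavior depends on DateTimeUtils.makeTime.
       if !(0..=23).contains(&h) || !(0..=59).contains(&m) || !(0..60_000_000).contains(&us) {
           return None;
       }
       // At most 86_400_000_000 microseconds per day, well within i64 range.
       Some((i64::from(h) * 3600 + i64::from(m) * 60) * 1_000_000 + us)
   }
   ```
   
   For example, `make_time(Some(14), Some(30), Some(45_123_456))` yields `Some(52_245_123_456)`, the microsecond-of-day encoding of 14:30:45.123456.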
   
   **Examples:**
   ```sql
   -- Create a time for 14:30:45.123456
   SELECT make_time(14, 30, 45.123456);
   
   -- Create midnight
   SELECT make_time(0, 0, 0);
   
   -- Create a time with microsecond precision
   SELECT make_time(23, 59, 59.999999);
   ```
   
   ```scala
   // Example DataFrame API usage
   import org.apache.spark.sql.functions._
   
   df.select(expr("make_time(14, 30, 45.123456)"))
   
   // Using column references
   df.select(expr("make_time(hour_col, minute_col, second_col)"))
   ```
   
   ### Implementation Approach
   
   See the [Comet guide on adding new 
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
 for detailed instructions.
   
   1. **Scala Serde**: Add expression handler in 
`spark/src/main/scala/org/apache/comet/serde/`
   2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
   3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if 
needed
   4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has 
built-in support first)
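   For step 4, a minimal columnar sketch could look like the following (hypothetical names, plain slices standing in for Arrow arrays), assuming the serde layer has already cast the inputs to `IntegerType`/`IntegerType`/`Decimal(16, 6)`. Note that at scale 6 the decimal's unscaled value is exactly the microsecond count of the seconds component:
   
   ```rust
   /// Hypothetical per-batch kernel: maps three nullable input columns to a
   /// nullable microseconds-since-midnight column. Not Comet's actual Rust API.
   fn make_time_kernel(
       hours: &[Option<i32>],
       minutes: &[Option<i32>],
       secs_unscaled: &[Option<i128>], // Decimal(16, 6) unscaled values
   ) -> Vec<Option<i64>> {
       hours
           .iter()
           .zip(minutes)
           .zip(secs_unscaled)
           .map(|((h, m), s)| {
               // Any NULL input yields NULL for that row.
               let (h, m, s) = ((*h)?, (*m)?, (*s)?);
               // Out-of-range handling to be verified against Spark.
               if !(0..=23).contains(&h) || !(0..=59).contains(&m) || !(0..60_000_000).contains(&s) {
                   return None;
               }
               Some((i64::from(h) * 3600 + i64::from(m) * 60) * 1_000_000 + s as i64)
           })
           .collect()
   }
   ```
   
   In a real implementation this would operate on Arrow `PrimitiveArray`s (and ideally reuse a DataFusion built-in if one exists), but the row-wise logic would be the same.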
   
   
   ## Additional context
   
   **Difficulty:** Medium
   **Spark Expression Class:** 
`org.apache.spark.sql.catalyst.expressions.MakeTime`
   
   **Related:**
   - `MakeDate` - Creates date values from components
   
   - `MakeTimestamp` - Creates timestamp values from date and time components
   
   - Time extraction functions like `hour()`, `minute()`, `second()`
   
   ---
   *This issue was auto-generated from Spark reference documentation.*
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
