andygrove opened a new issue, #3120:
URL: https://github.com/apache/datafusion-comet/issues/3120

   ## What is the problem the feature request solves?
   
   > **Note:** This issue was generated with AI assistance. The specification 
details have been extracted from Spark documentation and may need verification.
   
   Comet does not currently support the Spark `to_time` function, causing 
queries using this function to fall back to Spark's JVM execution instead of 
running natively on DataFusion.
   
   The `ToTime` expression converts a string representation of a time into a `TimeType` value. It supports both default parsing and custom parsing via an optional format string, and is a `RuntimeReplaceable` expression that delegates the actual parsing to a specialized `ToTimeParser` class.
   
   Supporting this expression would allow more Spark workloads to benefit from 
Comet's native acceleration.
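   
   To see the current fallback in practice, one can inspect the physical plan of a query that uses `to_time` with Comet enabled. The sketch below is only illustrative: it assumes a Spark build that provides `to_time`, and the `spark.plugins` / `spark.comet.enabled` settings are taken from the Comet installation docs and may vary by version.
   
   ```scala
   import org.apache.spark.sql.SparkSession
   
   // Minimal sketch: start a local session with the Comet plugin enabled.
   val spark = SparkSession.builder()
     .master("local[1]")
     .appName("to_time-fallback-check")
     .config("spark.plugins", "org.apache.spark.CometPlugin")
     .config("spark.comet.enabled", "true")
     .getOrCreate()
   
   // While ToTime is unsupported, the projection containing to_time stays on
   // Spark's JVM operators instead of being replaced by Comet operators.
   spark.sql("SELECT to_time('12:10:05') AS t").explain()
   ```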
   
   ## Describe the potential solution
   
   ### Spark Specification
   
   **Syntax:**
   ```sql
   to_time(str)
   to_time(str, format)
   ```
   
   ```scala
   // Catalyst expression constructors
   ToTime(stringExpr)
   ToTime(stringExpr, formatExpr)
   ```
   
   **Arguments:**
   | Argument | Type | Description |
   |----------|------|-------------|
   | `str` | String | The input string containing the time value to be parsed |
   | `format` | String (Optional) | Format pattern specifying how to parse the input string |
   
   **Return Type:** `TimeType` - the Spark SQL data type for time values.
   
   **Supported Data Types:**
   - **Input**: `StringTypeWithCollation` (trim collation is supported for both the string argument and the optional format argument)
   - **Output**: `TimeType`
   
   **Edge Cases:**
   - **Null input string**: Returns a null result
   - **Null format**: When the format expression evaluates to null, the result is a null literal
   - **Invalid format**: Handled by the underlying `ToTimeParser` implementation
   - **Unparseable time strings**: Behavior depends on the `ToTimeParser` implementation
   - **Non-foldable format**: Falls back to runtime evaluation of the format for dynamic format expressions
   
   **Examples:**
   ```sql
   -- Parse time with default format
   SELECT to_time('12:10:05');
   
   -- Parse time with custom format
   SELECT to_time('10:05 AM', 'hh:mm a');
   
   -- Handle null inputs
   SELECT to_time(NULL);
   ```
   
   ```scala
   // DataFrame API usage
   import org.apache.spark.sql.catalyst.expressions.ToTime
   import org.apache.spark.sql.functions._
   
   // Default parsing
   df.select(expr("to_time(time_string)"))
   
   // With format
   df.select(expr("to_time(time_string, 'HH:mm:ss')"))
   
   // Direct expression usage
   val timeExpr = ToTime(col("time_column").expr)
   val timeWithFormatExpr = ToTime(col("time_column").expr, lit("HH:mm").expr)
   ```
   
   ### Implementation Approach
   
   See the [Comet guide on adding new 
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
 for detailed instructions.
   
   1. **Scala Serde**: Add an expression handler in `spark/src/main/scala/org/apache/comet/serde/` (a hedged sketch follows this list)
   2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
   3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if 
needed
   4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has 
built-in support first)
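   
   For step 1, a rough sketch of what the serde handler could look like is below. This is not a final design: the `CometExpressionSerde` trait, the `exprToProtoInternal` and `scalarFunctionExprToProto` helpers, and their signatures are modeled on existing handlers in the Comet repo and may differ in the current codebase, and `"to_time"` as the native function name is an assumption.
   
   ```scala
   // Hypothetical sketch only: trait and helper names below are assumptions
   // based on existing Comet serde handlers, not confirmed APIs.
   import org.apache.spark.sql.catalyst.expressions.{Attribute, Expression}
   
   import org.apache.comet.serde.{CometExpressionSerde, ExprOuterClass}
   import org.apache.comet.serde.QueryPlanSerde.{exprToProtoInternal, scalarFunctionExprToProto}
   
   object CometToTime extends CometExpressionSerde {
     override def convert(
         expr: Expression,
         inputs: Seq[Attribute],
         binding: Boolean): Option[ExprOuterClass.Expr] = {
       // ToTime(str) or ToTime(str, format): serialize each child expression,
       // then emit a scalar-function call for the native side (step 4) to handle.
       val childProtos = expr.children.map(exprToProtoInternal(_, inputs, binding))
       if (childProtos.forall(_.isDefined)) {
         scalarFunctionExprToProto("to_time", childProtos: _*)
       } else {
         None // an unsupported child keeps the whole expression on the Spark path
       }
     }
   }
   ```
   
   Step 2 would then map `classOf[ToTime]` to this handler in `QueryPlanSerde.scala`, again assuming the existing map-based registration is still in place.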
   
   
   ## Additional context
   
   **Difficulty:** Medium
   **Spark Expression Class:** 
`org.apache.spark.sql.catalyst.expressions.ToTime`
   
   **Related:**
   - Time-related expressions and functions
   - `TimeType` data type documentation
   - Date/time parsing expressions
   - `RuntimeReplaceable` expression pattern
   
   ---
   *This issue was auto-generated from Spark reference documentation.*
   

