andygrove opened a new issue, #3130:
URL: https://github.com/apache/datafusion-comet/issues/3130

   ## What is the problem the feature request solves?
   
   > **Note:** This issue was generated with AI assistance. The specification 
details have been extracted from Spark documentation and may need verification.
   
   Comet does not currently support the Spark `SecondsOfTimeWithFraction` 
expression, so queries that use it fall back to Spark's JVM execution 
instead of running natively on DataFusion.
   
   SecondsOfTimeWithFraction extracts the seconds component from a time value 
and returns it as a decimal that preserves the fractional part. The expression 
is a RuntimeReplaceable that delegates the actual computation to DateTimeUtils, 
preserving subsecond precision based on the precision of the input time type.
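   
   To make the intended semantics concrete, here is a minimal, Spark-free 
sketch of the computation, assuming the time-of-day value is stored as 
nanoseconds since midnight (the object and method names here are 
illustrative, not Spark or Comet APIs):
   
   ```scala
   // Standalone sketch (no Spark dependencies): derive seconds-with-fraction
   // from a time-of-day value stored as nanoseconds since midnight. The
   // storage representation is an assumption for illustration.
   object SecondsWithFractionSketch {
     private val NanosPerMinute = 60L * 1000000000L
   
     def secondsWithFraction(nanosOfDay: Long): BigDecimal = {
       val nanosWithinMinute = nanosOfDay % NanosPerMinute // strip hours/minutes
       // Keep microsecond precision (6 decimal places), matching DecimalType(8, 6)
       BigDecimal(nanosWithinMinute / 1000L) / BigDecimal(1000000L)
     }
   
     def main(args: Array[String]): Unit = {
       // 14:30:25.123456 expressed as nanoseconds since midnight
       val t = (14L * 3600 + 30 * 60 + 25) * 1000000000L + 123456000L
       println(secondsWithFraction(t)) // prints 25.123456
     }
   }
   ```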
   
   Supporting this expression would allow more Spark workloads to benefit from 
Comet's native acceleration.
   
   ## Describe the potential solution
   
   ### Spark Specification
   
   **Syntax:**
   ```sql
   -- SQL syntax (function name may vary by implementation)
   SECONDS_WITH_FRACTION(time_expression)
   ```
   
   ```scala
   // Catalyst expression construction (internal API, not a public
   // DataFrame function); `timeColumn` is a Catalyst Expression
   SecondsOfTimeWithFraction(timeColumn)
   ```
   
   **Arguments:**
   | Argument | Type | Description |
   |----------|------|-------------|
   | child | Expression | The time expression from which to extract seconds with fractional component |
   
   **Return Type:** DecimalType(8, 6) - A decimal with precision 8 and scale 6, 
allowing values up to 99.999999 seconds.
   
   **Supported Data Types:**
   - TimeType with any precision
   - Any type conforming to AnyTimeType abstract data type
   
   **Edge Cases:**
   - Null input values are handled by the underlying DateTimeUtils 
implementation
   - Non-TimeType inputs default to minimum precision (TimeType.MIN_PRECISION)
   - Fractional seconds are preserved up to microsecond precision (6 decimal 
places)
   - Invalid time values may result in null or error depending on DateTimeUtils 
behavior
   
   **Examples:**
   ```sql
   -- Extract seconds with fractional component from time
   SELECT SECONDS_WITH_FRACTION(TIME '14:30:25.123456') AS seconds_fraction;
   -- Result: 25.123456
   ```
   
   ```scala
   // Constructing the Catalyst expression directly. Assumes `df` is a
   // DataFrame with a TIME column named "time_column". Note that wrapping a
   // Catalyst Expression in a Column relies on internal APIs that vary
   // across Spark versions.
   import org.apache.spark.sql.Column
   import org.apache.spark.sql.catalyst.expressions.SecondsOfTimeWithFraction
   import org.apache.spark.sql.functions.col
   
   val result = df.select(
     new Column(SecondsOfTimeWithFraction(col("time_column").expr))
       .as("seconds_with_fraction")
   )
   ```
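   
   To check whether the expression actually ran natively, inspecting the 
physical plan is a reasonable verification step; unsupported expressions 
surface as plain Spark operators:
   
   ```scala
   // If Comet accelerated the projection, Comet operators (e.g. CometProject)
   // appear in the plan; exact operator names depend on the Comet version.
   result.explain()
   ```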
   
   ### Implementation Approach
   
   See the [Comet guide on adding new 
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
 for detailed instructions.
   
   1. **Scala Serde**: Add an expression handler in 
`spark/src/main/scala/org/apache/comet/serde/` (see the sketch after this list)
   2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
   3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if 
needed
   4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has 
built-in support first)
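   
   For step 1, a hypothetical sketch of the serde handler follows. The trait 
and helper names (`CometExpressionSerde`, `exprToProtoInternal`, 
`scalarFunctionExprToProto`) and the native function name are assumptions 
modeled on the pattern in the contributor guide; verify them against the 
current codebase:
   
   ```scala
   // Hypothetical serde handler for SecondsOfTimeWithFraction. All Comet-side
   // names (trait, helpers, native function name) are assumptions to verify.
   import org.apache.spark.sql.catalyst.expressions.{Attribute, SecondsOfTimeWithFraction}
   
   object CometSecondsOfTimeWithFraction
       extends CometExpressionSerde[SecondsOfTimeWithFraction] {
   
     override def convert(
         expr: SecondsOfTimeWithFraction,
         inputs: Seq[Attribute],
         binding: Boolean): Option[ExprOuterClass.Expr] = {
       // Serialize the child, then map it to a native scalar function whose
       // registered name on the Rust side is assumed here.
       val childProto = exprToProtoInternal(expr.child, inputs, binding)
       scalarFunctionExprToProto("seconds_of_time_with_fraction", childProto)
     }
   }
   ```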
   
   
   ## Additional context
   
   **Difficulty:** Medium
   **Spark Expression Class:** 
`org.apache.spark.sql.catalyst.expressions.SecondsOfTimeWithFraction`
   
   **Related:**
   - TimeExpression - Base trait for time-related expressions
   - RuntimeReplaceable - Interface for expressions replaced at runtime
   - DateTimeUtils - Utility class containing time computation methods
   - StaticInvoke - Expression for calling static methods with code generation
   
   ---
   *This issue was auto-generated from Spark reference documentation.*
   

