andygrove opened a new issue, #3135:
URL: https://github.com/apache/datafusion-comet/issues/3135

   ## What is the problem the feature request solves?
   
   > **Note:** This issue was generated with AI assistance. The specification 
details have been extracted from Spark documentation and may need verification.
   
   Comet does not currently support the Spark `from_unixtime` function, 
causing queries that use this function to fall back to Spark's JVM execution 
instead of running natively on DataFusion.
   
   The `FromUnixTime` expression converts Unix timestamps (seconds since epoch) 
to formatted timestamp strings. It takes a Unix timestamp and an optional 
format pattern, returning a human-readable date-time string representation 
according to the specified format.
   
   Supporting this expression would allow more Spark workloads to benefit from 
Comet's native acceleration.
   
   ## Describe the potential solution
   
   ### Spark Specification
   
   **Syntax:**
   ```sql
   FROM_UNIXTIME(unix_timestamp [, format])
   ```
   
   ```scala
   // DataFrame API
   // A Unix timestamp in seconds can be obtained via a cast:
   col("timestamp_col").cast("long")
   // Format it as a date-time string:
   from_unixtime(col("unix_seconds"), "yyyy-MM-dd HH:mm:ss")
   ```
   
   **Arguments:**
   | Argument | Type | Description |
   |----------|------|-------------|
   | sec | Long | Unix timestamp in seconds since epoch (1970-01-01 00:00:00 
UTC) |
   | format | String | Optional format pattern string (defaults to 
`TimestampFormatter.defaultPattern()`, i.e. `yyyy-MM-dd HH:mm:ss`) |
   | timeZoneId | String | Optional timezone identifier for formatting 
(internal parameter) |
   
   **Return Type:** Returns `StringType` - A UTF-8 encoded string 
representation of the formatted timestamp.
   
   **Supported Data Types:**
   - **sec parameter**: `LongType` only
   - **format parameter**: `StringType` with collation support (supports trim 
collation)
   - Implicit casting is supported for input types through 
`ImplicitCastInputTypes`
   
   **Edge Cases:**
   - **Null handling**: Returns null if either input argument is null 
(null-intolerant behavior)
   - **Invalid format patterns**: May throw runtime exceptions for malformed 
format strings
   - **Timestamp overflow**: Large Unix timestamps may cause formatting errors 
or unexpected results
   - **Timezone handling**: Uses the Spark session timezone 
(`spark.sql.session.timeZone`) when no explicit timezone is provided
   - **Negative timestamps**: Supports negative Unix timestamps (dates before 
1970-01-01)
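
   The null, timezone, and negative-timestamp semantics above can be sketched 
with a small Python illustration. This is only a model of the expected behavior 
under a UTC session timezone, not Comet's actual Scala/Rust implementation, and 
the strftime pattern stands in for Spark's Java-style format pattern:

```python
from datetime import datetime, timedelta, timezone

# Unix epoch as an aware UTC datetime; adding a timedelta avoids
# platform quirks with negative timestamps in fromtimestamp().
EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def from_unixtime(sec, fmt="%Y-%m-%d %H:%M:%S"):
    """Model FROM_UNIXTIME for a UTC session timezone.

    Returns None when the input is None, mirroring Spark's
    null-intolerant behavior. The default strftime pattern here
    corresponds to Spark's default 'yyyy-MM-dd HH:mm:ss'.
    """
    if sec is None:
        return None
    return (EPOCH + timedelta(seconds=sec)).strftime(fmt)

print(from_unixtime(1672531200))  # 2023-01-01 00:00:00
print(from_unixtime(-86400))      # negative timestamp: 1969-12-31 00:00:00
print(from_unixtime(None))        # None
```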
   
   **Examples:**
   ```sql
   -- Basic usage with default format
   -- (results below assume a UTC session timezone)
   SELECT FROM_UNIXTIME(1672531200) AS formatted_time;
   -- Result: "2023-01-01 00:00:00"
   
   -- Custom format pattern
   SELECT FROM_UNIXTIME(1672531200, 'yyyy/MM/dd HH:mm:ss') AS custom_format;
   -- Result: "2023/01/01 00:00:00"
   
   -- Handle null values
   SELECT FROM_UNIXTIME(NULL) AS null_result;
   -- Result: NULL
   ```
   
   ```scala
   // DataFrame API usage
   import org.apache.spark.sql.functions._
   
   // Basic conversion
   df.select(from_unixtime(col("unix_seconds")))
   
   // Custom format
   df.select(from_unixtime(col("unix_seconds"), "dd/MM/yyyy HH:mm"))
   
   // With timezone handling
   df.select(from_unixtime(col("unix_seconds")).cast("timestamp"))
   ```
   
   ### Implementation Approach
   
   See the [Comet guide on adding new 
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
 for detailed instructions.
   
   1. **Scala Serde**: Add expression handler in 
`spark/src/main/scala/org/apache/comet/serde/`
   2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
   3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if 
needed
   4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has 
built-in support first)
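
   One detail to watch for in the native implementation: Spark format patterns 
follow Java's `DateTimeFormatter` syntax (`yyyy-MM-dd HH:mm:ss`), while 
chrono-based formatting on the Rust side uses strftime-style specifiers, so 
patterns must be translated rather than passed through verbatim. A minimal, 
hypothetical sketch of that mapping in Python (the token table is an 
illustrative subset, not an exhaustive or robust translator):

```python
# Hypothetical Java-pattern -> strftime translation table
# (illustrative subset only; a real translator must tokenize the
# pattern and handle quoted literals and repeated-letter counts).
JAVA_TO_STRFTIME = {
    "yyyy": "%Y",
    "MM": "%m",
    "dd": "%d",
    "HH": "%H",
    "mm": "%M",
    "ss": "%S",
}

def translate_pattern(java_pattern):
    """Greedily replace known Java tokens with strftime equivalents,
    longest token first so 'yyyy' is consumed before shorter tokens."""
    out = java_pattern
    for token in sorted(JAVA_TO_STRFTIME, key=len, reverse=True):
        out = out.replace(token, JAVA_TO_STRFTIME[token])
    return out

print(translate_pattern("yyyy-MM-dd HH:mm:ss"))  # %Y-%m-%d %H:%M:%S
print(translate_pattern("dd/MM/yyyy HH:mm"))     # %d/%m/%Y %H:%M
```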
   
   
   ## Additional context
   
   **Difficulty:** Medium
   **Spark Expression Class:** 
`org.apache.spark.sql.catalyst.expressions.FromUnixTime`
   
   **Related:**
   - `UnixTimestamp` - Converts formatted timestamp strings back to Unix 
timestamps
   - `ToTimestamp` - Converts strings to timestamp data type
   - `DateFormatClass` - Related date formatting expressions
   - `TimestampFormatter` - Underlying formatter implementation
   
   ---
   *This issue was auto-generated from Spark reference documentation.*
   

