andygrove opened a new issue, #3095:
URL: https://github.com/apache/datafusion-comet/issues/3095

   ## What is the problem the feature request solves?
   
   > **Note:** This issue was generated with AI assistance. The specification 
details have been extracted from Spark documentation and may need verification.
   
   Comet does not currently support the Spark `unix_date` function, causing 
queries using this function to fall back to Spark's JVM execution instead of 
running natively on DataFusion.
   
   The `UnixDate` expression converts a date value to its Unix date 
representation, which is the number of days since the Unix epoch (1970-01-01). 
This expression provides a way to get the integer representation of a date for 
numerical operations or storage optimization.
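   The day-count semantics can be illustrated with a standalone Rust sketch (not Comet code) using the well-known civil-calendar algorithm by Howard Hinnant, which computes the same value as `java.time.LocalDate::toEpochDay`:
   
   ```rust
   // Standalone sketch of unix_date semantics: days since 1970-01-01.
   // Uses Howard Hinnant's "days from civil" algorithm; mirrors what
   // java.time.LocalDate::toEpochDay computes on the JVM side.
   fn unix_date(year: i32, month: u32, day: u32) -> i32 {
       let y: i64 = if month <= 2 { year as i64 - 1 } else { year as i64 };
       let era = (if y >= 0 { y } else { y - 399 }) / 400;
       let yoe = y - era * 400;                         // year of era [0, 399]
       let mp = if month > 2 { month as i64 - 3 } else { month as i64 + 9 };
       let doy = (153 * mp + 2) / 5 + day as i64 - 1;   // day of year [0, 365]
       let doe = yoe * 365 + yoe / 4 - yoe / 100 + doy; // day of era
       (era * 146097 + doe - 719468) as i32             // 719468 days from 0000-03-01 to 1970-01-01
   }
   
   fn main() {
       assert_eq!(unix_date(1970, 1, 2), 1);    // one day after the epoch
       assert_eq!(unix_date(1969, 12, 31), -1); // pre-epoch dates are negative
       println!("1970-01-02 -> {}", unix_date(1970, 1, 2));
   }
   ```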
   
   Supporting this expression would allow more Spark workloads to benefit from 
Comet's native acceleration.
   
   ## Describe the potential solution
   
   ### Spark Specification
   
   **Syntax:**
   ```sql
   unix_date(date_expr)
   ```
   
   ```scala
   // DataFrame API
   df.select(expr("unix_date(date_column)"))
   ```
   
   **Arguments:**
   | Argument | Type | Description |
   |----------|------|-------------|
   | date_expr | DATE | The date value to convert to Unix date representation |
   
   **Return Type:** `INTEGER` - Returns the number of days since 1970-01-01 as 
an integer value.
   
   **Supported Data Types:**
   - `DateType` - Only date type inputs are supported
   
   **Edge Cases:**
   - **Null handling**: Returns null when the input date is null (null-safe evaluation)
   - **Negative dates**: Dates before 1970-01-01 return negative integer values
   - **Type enforcement**: Only DATE inputs are accepted; other input types fail Spark's analysis-time type check
   - **Integer range**: Spark stores `DateType` internally as a 32-bit day count, so the result always fits in `INTEGER`
   
   **Examples:**
   ```sql
   -- Convert specific date to Unix date
   SELECT unix_date(DATE('1970-01-02'));
   -- Returns: 1
   
   -- Convert date column
   SELECT unix_date(birth_date) FROM users;
   
   -- Use in calculations
   SELECT unix_date(end_date) - unix_date(start_date) AS duration_days FROM 
events;
   ```
   
   ```scala
   // DataFrame API usage
   import org.apache.spark.sql.functions._
   
   df.select(expr("unix_date(date_column)"))
   
   // In transformations
   df.withColumn("unix_date", expr("unix_date(original_date)"))
   
   // Calculate date differences
   df.select(
     expr("unix_date(end_date) - unix_date(start_date)").alias("duration")
   )
   ```
   
   ### Implementation Approach
   
   See the [Comet guide on adding new 
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
 for detailed instructions.
   
   1. **Scala Serde**: Add expression handler in 
`spark/src/main/scala/org/apache/comet/serde/`
   2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
   3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if 
needed
   4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has 
built-in support first)
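   A note on step 4: Arrow's `Date32` physical type already stores the day count since 1970-01-01 as an `i32`, so the native kernel amounts to a null-preserving reinterpretation of the input column as `Int32` (DataFusion's cast from `Date32` to `Int32` may already cover this; worth checking before writing a custom kernel). A minimal, dependency-free sketch of that behavior, where `Option<i32>` stands in for an Arrow array with a validity mask (not actual Comet or DataFusion API):
   
   ```rust
   // Sketch only: models a unix_date kernel over a Date32 column.
   // Arrow Date32 values are already "days since 1970-01-01" (i32),
   // so the kernel passes values through unchanged, preserving nulls.
   fn unix_date_kernel(date32: &[Option<i32>]) -> Vec<Option<i32>> {
       date32.iter().copied().collect()
   }
   
   fn main() {
       let input = vec![Some(1), Some(-1), None];      // 1970-01-02, 1969-12-31, NULL
       let out = unix_date_kernel(&input);
       assert_eq!(out, vec![Some(1), Some(-1), None]); // values unchanged, null preserved
       println!("{:?}", out);
   }
   ```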
   
   
   ## Additional context
   
   **Difficulty:** Medium
   **Spark Expression Class:** 
`org.apache.spark.sql.catalyst.expressions.UnixDate`
   
   **Related:**
   - `to_date()` - Convert strings or timestamps to date type
   - `date_add()` - Add days to a date
   - `date_sub()` - Subtract days from a date
   - `unix_timestamp()` - Convert timestamp to Unix timestamp (seconds since 
epoch)
   
   ---
   *This issue was auto-generated from Spark reference documentation.*
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

