andygrove opened a new issue, #3105:
URL: https://github.com/apache/datafusion-comet/issues/3105

   ## What is the problem the feature request solves?
   
   > **Note:** This issue was generated with AI assistance. The specification 
details have been extracted from Spark documentation and may need verification.
   
   Comet does not currently support Spark's `localtimestamp` function, so 
queries that use it fall back to Spark's JVM execution instead of running 
natively on DataFusion.
   
   The `LocalTimestamp` expression returns the current timestamp, without 
timezone information, at the time of query execution. It produces a 
`TimestampNTZ` (timestamp without timezone) value in the session's local 
timezone, representing the date and time at which the query started.
   
   Supporting this expression would allow more Spark workloads to benefit from 
Comet's native acceleration.
   
   ## Describe the potential solution
   
   ### Spark Specification
   
   **Syntax:**
   ```sql
   LOCALTIMESTAMP()
   -- or
   SELECT localtimestamp();
   ```
   
   ```scala
   // DataFrame API
   import org.apache.spark.sql.functions._
   df.select(expr("localtimestamp()"))
   ```
   
   **Arguments:**
   This expression takes no arguments. The `timeZoneId` parameter is used 
internally for timezone-aware processing but is not exposed to users.
   
   **Return Type:** `TimestampNTZType` - Timestamp without timezone 
information, represented as microseconds since epoch.
   
   **Supported Data Types:**
   This is a leaf expression: it generates timestamp values and accepts no 
input of any type.
   
   **Edge Cases:**
   - Never returns null (nullable = false)
   
   - Returns the same timestamp value for all rows within a single query 
execution due to constant folding
   
   - Timezone handling depends on the session's configured timezone
   
   - The timestamp represents the query execution time, not row processing time
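   
   The encoding behind these behaviors can be sketched with plain `java.time` 
(illustrative only; the helper name and logic below are assumptions about the 
semantics described above, not Spark internals):
   
   ```scala
   import java.time.{Instant, LocalDateTime, ZoneId, ZoneOffset}
   import java.time.temporal.ChronoUnit
   
   // Sketch (plain java.time, not Comet/Spark code): a TimestampNTZ value is
   // the local wall-clock time encoded as microseconds since
   // 1970-01-01 00:00:00, with no timezone attached. Take the query start
   // instant, shift it into the session timezone, then drop the zone.
   def localTimestampMicros(queryStart: Instant, sessionTz: ZoneId): Long = {
     val local = LocalDateTime.ofInstant(queryStart, sessionTz)
     val epoch = LocalDateTime.ofEpochSecond(0, 0, ZoneOffset.UTC)
     ChronoUnit.MICROS.between(epoch, local)
   }
   
   // A query starting at 15:49:11.914 UTC in a UTC+2 session sees 17:49:11.914.
   val micros = localTimestampMicros(
     Instant.parse("2020-04-25T15:49:11.914Z"), ZoneId.of("+02:00"))
   println(micros) // 1587836951914000
   ```
   
   This also makes the edge cases concrete: the result depends on the session 
timezone, and computing it once per query gives every row the same value.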
   
   **Examples:**
   ```sql
   -- Get current local timestamp
   SELECT localtimestamp();
   -- Output: 2020-04-25 15:49:11.914
   
   -- Use in SELECT with other columns
   SELECT id, name, localtimestamp() as created_at FROM users;
   
   -- Use in WHERE clause for time-based filtering
   SELECT * FROM events WHERE event_time > localtimestamp() - INTERVAL 1 HOUR;
   ```
   
   ```scala
   // DataFrame API usage
   import org.apache.spark.sql.functions._
   
   // Add current timestamp column
   df.withColumn("processed_at", expr("localtimestamp()"))
   
   // Filter using current timestamp
   df.filter(col("updated_at") > expr("localtimestamp() - INTERVAL 1 DAY"))
   ```
   
   ### Implementation Approach
   
   See the [Comet guide on adding new 
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
 for detailed instructions.
   
   1. **Scala Serde**: Add expression handler in 
`spark/src/main/scala/org/apache/comet/serde/`
   2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
   3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if 
needed
   4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has 
built-in support first)
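   
   As a rough illustration of steps 1 and 3, a handler for `LocalTimestamp` 
could follow the pattern described in the contributor guide. This is a 
hypothetical, non-compiling sketch: the `CometExpressionSerde` trait, the 
`convert` signature, and the proto builder names are assumptions and should 
be checked against the existing handlers in `org.apache.comet.serde`:
   
   ```scala
   // Hypothetical sketch only; all names are assumptions, not verified API.
   object CometLocalTimestamp extends CometExpressionSerde[LocalTimestamp] {
     override def convert(
         expr: LocalTimestamp,
         inputs: Seq[Attribute],
         binding: Boolean): Option[ExprOuterClass.Expr] = {
       // LocalTimestamp is a leaf expression: there are no children to
       // serialize, only the session timezone needed to shift the query
       // start time into local wall-clock time on the native side.
       val builder = ExprOuterClass.LocalTimestamp.newBuilder()
       expr.timeZoneId.foreach(builder.setTimezone)
       Some(ExprOuterClass.Expr.newBuilder().setLocalTimestamp(builder).build())
     }
   }
   ```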
   
   
   ## Additional context
   
   **Difficulty:** Medium
   **Spark Expression Class:** 
`org.apache.spark.sql.catalyst.expressions.LocalTimestamp`
   
   **Related:**
   - `CurrentTimestamp` - Returns current timestamp with timezone information
   - `Now` - Alias for current timestamp
   - `UnixTimestamp` - Returns current time as Unix timestamp
   - `CurrentDate` - Returns current date without time component
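   
   The difference between `CurrentTimestamp` and `LocalTimestamp` can be shown 
with plain `java.time` (illustrative only, not Spark code): `CurrentTimestamp` 
denotes a fixed instant, while `LocalTimestamp` is that instant's wall-clock 
reading in the session timezone, so the same instant yields different 
TimestampNTZ values under different session timezones.
   
   ```scala
   import java.time.{Instant, LocalDateTime, ZoneId}
   
   // One fixed instant (CurrentTimestamp semantics)...
   val instant = Instant.parse("2020-04-25T15:49:11Z")
   // ...read as local wall-clock time (LocalTimestamp semantics) in two zones.
   val inUtc   = LocalDateTime.ofInstant(instant, ZoneId.of("UTC"))
   val inTokyo = LocalDateTime.ofInstant(instant, ZoneId.of("Asia/Tokyo"))
   println(inUtc)   // 2020-04-25T15:49:11
   println(inTokyo) // 2020-04-26T00:49:11
   ```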
   
   ---
   *This issue was auto-generated from Spark reference documentation.*
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

