andygrove opened a new issue, #3185:
URL: https://github.com/apache/datafusion-comet/issues/3185

   ## What is the problem the feature request solves?
   
   > **Note:** This issue was generated with AI assistance. The specification 
details have been extracted from Spark documentation and may need verification.
   
   Comet does not currently support the Spark `url_encode` function, causing 
queries using this function to fall back to Spark's JVM execution instead of 
running natively on DataFusion.
   
   The `UrlEncode` expression performs URL encoding (percent encoding) on a 
string input, converting special characters to their percent-encoded 
equivalents. In Spark it is implemented as a runtime-replaceable expression 
that delegates the actual encoding logic to `UrlCodec.encode`.
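
   Spark's `UrlCodec.encode` wraps `java.net.URLEncoder` with a UTF-8 charset, so the expected behavior can be sketched in plain Java (a minimal illustration of the semantics, not Comet code; the class name `UrlEncodeDemo` is invented for this example):

   ```java
   import java.net.URLEncoder;
   import java.nio.charset.StandardCharsets;

   public class UrlEncodeDemo {
       public static void main(String[] args) {
           // url_encode's output for non-null input matches URLEncoder with UTF-8:
           // ':' becomes %3A, '/' becomes %2F, unreserved characters pass through.
           String encoded = URLEncoder.encode("https://spark.apache.org", StandardCharsets.UTF_8);
           System.out.println(encoded); // https%3A%2F%2Fspark.apache.org
       }
   }
   ```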
   
   Supporting this expression would allow more Spark workloads to benefit from 
Comet's native acceleration.
   
   ## Describe the potential solution
   
   ### Spark Specification
   
   **Syntax:**
   ```sql
   url_encode(str)
   ```
   
   ```scala
   // DataFrame API
   import org.apache.spark.sql.functions._
   df.select(url_encode(col("url_column")))
   ```
   
   **Arguments:**
   | Argument | Type | Description |
   |----------|------|-------------|
   | str | String | The input string to be URL encoded |
   
   **Return Type:** Returns a `String` containing the URL-encoded version of 
the input string.
   
   **Supported Data Types:**
   - String types with collation support (specifically 
`StringTypeWithCollation` with trim collation support)
   
   **Edge Cases:**
   - **Null handling**: Follows standard Spark null propagation - null input 
returns null output
   - **Empty string**: Empty strings are processed normally and return empty 
strings
   - **Already encoded strings**: No special handling - characters like `%` 
will be encoded again (e.g., `%` becomes `%25`)
   - **Unicode characters**: Non-ASCII characters are encoded as the 
percent-encoded bytes of their UTF-8 representation (e.g., `é` becomes `%C3%A9`)
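
   The edge cases above (other than null propagation, which is handled by Spark itself) can be checked directly against `java.net.URLEncoder`, the JDK class Spark's codec wraps (a quick sketch; the class name `UrlEncodeEdgeCases` is invented for this example):

   ```java
   import java.net.URLEncoder;
   import java.nio.charset.StandardCharsets;

   public class UrlEncodeEdgeCases {
       public static void main(String[] args) {
           // Empty string passes through unchanged.
           System.out.println(URLEncoder.encode("", StandardCharsets.UTF_8).isEmpty()); // true
           // '%' gets no special treatment: already-encoded input is encoded again.
           System.out.println(URLEncoder.encode("%25", StandardCharsets.UTF_8)); // %2525
           // Unicode is encoded as the percent-encoded bytes of its UTF-8 form.
           System.out.println(URLEncoder.encode("é", StandardCharsets.UTF_8)); // %C3%A9
       }
   }
   ```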
   
   **Examples:**
   ```sql
   -- Basic URL encoding
   SELECT url_encode('https://spark.apache.org') AS encoded_url;
   -- Returns: https%3A%2F%2Fspark.apache.org
   
   -- Encoding special characters
   SELECT url_encode('hello world!@#$%') AS encoded_special;
   -- Returns: hello+world%21%40%23%24%25
   
   -- Handling null values
   SELECT url_encode(NULL) AS encoded_null;
   -- Returns: NULL
   ```
   
   ```scala
   // DataFrame API usage
   import org.apache.spark.sql.functions._
   
   val df = Seq(
     "https://spark.apache.org",
     "hello world!",
     null
   ).toDF("url")
   
   df.select(url_encode(col("url")).alias("encoded_url")).show()
   
   // Direct usage in transformations
   df.withColumn("encoded_url", url_encode(col("url")))
   ```
   
   ### Implementation Approach
   
   See the [Comet guide on adding new 
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
 for detailed instructions.
   
   1. **Scala Serde**: Add expression handler in 
`spark/src/main/scala/org/apache/comet/serde/`
   2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
   3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if 
needed
   4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has 
built-in support first)
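
   One detail worth keeping in mind for step 4: `java.net.URLEncoder` implements `application/x-www-form-urlencoded` encoding, which turns a space into `+` and leaves `.`, `-`, `*`, and `_` literal, while typical Rust percent-encoding utilities default to RFC 3986 style (`%20` for a space). A native implementation needs to match the Java behavior shown below to stay Spark-compatible (a hedged sketch of the reference semantics; the class name `UrlEncodeReference` is invented for this example):

   ```java
   import java.net.URLEncoder;
   import java.nio.charset.StandardCharsets;

   public class UrlEncodeReference {
       public static void main(String[] args) {
           // application/x-www-form-urlencoded: space becomes '+',
           // and '.', '-', '*', '_' stay literal. Plain RFC 3986
           // percent-encoding would instead emit %20 for the space.
           System.out.println(URLEncoder.encode("a b.c-d*e_f", StandardCharsets.UTF_8)); // a+b.c-d*e_f
       }
   }
   ```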
   
   
   ## Additional context
   
   **Difficulty:** Large
   **Spark Expression Class:** 
`org.apache.spark.sql.catalyst.expressions.UrlEncode`
   
   **Related:**
   - `UrlDecode` - For decoding URL-encoded strings
   - `Base64` - For Base64 encoding operations
   - String manipulation functions in the `url_funcs` group
   
   ---
   *This issue was auto-generated from Spark reference documentation.*
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

