andygrove opened a new issue, #3168:
URL: https://github.com/apache/datafusion-comet/issues/3168
## What is the problem the feature request solves?
> **Note:** This issue was generated with AI assistance. The specification details have been extracted from Spark documentation and may need verification.

Comet does not currently support the Spark `string_to_map` function, so queries that use it fall back to Spark's JVM execution instead of running natively on DataFusion.

StringToMap is a Spark Catalyst expression that parses a string into a MapType value: the input is split into key-value pairs by a pair delimiter, and each pair is split into a key and a value by a key-value delimiter.

Supporting this expression would allow more Spark workloads to benefit from Comet's native acceleration.
## Describe the potential solution
### Spark Specification
**Syntax:**
```sql
str_to_map(text[, pairDelim[, keyValueDelim]])
```
**Arguments:**
| Argument | Type | Description |
|----------|------|-------------|
| text | Expression | The input string to be parsed into a map |
| pairDelim | Expression | The delimiter separating key-value pairs; optional, defaults to `','` |
| keyValueDelim | Expression | The delimiter separating keys from values within each pair; optional, defaults to `':'` |

Both delimiters are treated as regular expressions.
**Return Type:** `MapType(StringType, StringType, valueContainsNull = true)`, i.e. Map[String, String] whose values may be null
**Supported Data Types:**
- **text**: StringType or expressions that can be cast to StringType
- **pairDelim**: StringType literals or expressions
- **keyValueDelim**: StringType literals or expressions
**Edge Cases:**
- **Null handling**: Returns null when the input text is null
- **Empty input**: An empty input string produces a single-entry map with an empty-string key and a null value, not an empty map
- **Missing pair delimiter**: If the pair delimiter never occurs, the entire input is treated as a single key-value pair
- **Missing key-value delimiter**: A pair that does not contain the key-value delimiter becomes a key mapped to a null value
- **Duplicate keys**: Governed by `spark.sql.mapKeyDedupPolicy`; the default (`EXCEPTION`) raises an error, while `LAST_WIN` lets later occurrences overwrite earlier ones
- **Empty keys or values**: Empty strings are valid keys and values
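As an executable reference for these rules, the parsing semantics can be modeled in a few lines of Scala. This is a sketch of the behavior described above (mirroring a split-with-limit approach and last-wins duplicate handling), not Spark's actual code; verify against `org.apache.spark.sql.catalyst.expressions.StringToMap`.

```scala
// Minimal model of str_to_map semantics; for illustration only.
def strToMap(text: String, pairDelim: String, kvDelim: String): Map[String, String] = {
  // Delimiters are regular expressions. Limit -1 keeps trailing empty pairs;
  // limit 2 splits each pair at the first key-value delimiter only.
  text.split(pairDelim, -1).map { pair =>
    val kv = pair.split(kvDelim, 2)
    kv(0) -> (if (kv.length == 2) kv(1) else null)
  }.toMap // plain toMap is last-wins; Spark's default policy raises on duplicates
}

strToMap("key1:value1,key2:value2", ",", ":") // Map(key1 -> value1, key2 -> value2)
strToMap("a,b:value2", ",", ":")              // Map(a -> null, b -> value2)
strToMap("", ",", ":")                        // Map("" -> null)
```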
**Examples:**
```sql
-- Basic usage with comma and colon delimiters
SELECT str_to_map('key1:value1,key2:value2', ',', ':') AS result;
-- Returns: {"key1":"value1", "key2":"value2"}
-- Using different delimiters
SELECT str_to_map('a=1;b=2;c=3', ';', '=') AS result;
-- Returns: {"a":"1", "b":"2", "c":"3"}
-- Pairs without the key-value delimiter yield null values
SELECT str_to_map('a,b:value2', ',', ':') AS result;
-- Returns: {"a":null, "b":"value2"}
```
```scala
// DataFrame API usage
import org.apache.spark.sql.functions._
df.select(expr("str_to_map(text_column, ',', ':')").as("parsed_map"))
// Using with column references
df.select(expr("str_to_map(input_text, pair_delim_col,
kv_delim_col)").as("result"))
```
### Implementation Approach
See the [Comet guide on adding new
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
for detailed instructions.
1. **Scala Serde**: Add an expression handler in
`spark/src/main/scala/org/apache/comet/serde/` (see the sketch after this list)
2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if
needed
4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has
built-in support first)
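As a rough sketch of step 1, a handler could follow the pattern from the guide above. Everything here is illustrative: the `CometExpressionSerde` trait, the helpers `exprToProtoInternal` and `scalarFunctionExprToProto`, and the native function name `string_to_map` are assumptions that must be checked against the current Comet codebase.

```scala
// Hypothetical serde handler (spark/src/main/scala/org/apache/comet/serde/);
// names and signatures are illustrative, not the confirmed Comet API.
import org.apache.spark.sql.catalyst.expressions.{Attribute, StringToMap}

object CometStringToMap extends CometExpressionSerde[StringToMap] {
  override def convert(
      expr: StringToMap,
      inputs: Seq[Attribute],
      binding: Boolean): Option[Expr] = {
    // Serialize each child expression; None means an unsupported child,
    // which makes the whole expression fall back to Spark.
    val text = exprToProtoInternal(expr.text, inputs, binding)
    val pairDelim = exprToProtoInternal(expr.pairDelim, inputs, binding)
    val keyValueDelim = exprToProtoInternal(expr.keyValueDelim, inputs, binding)
    // Delegate to a scalar function registered on the native (Rust) side.
    scalarFunctionExprToProto("string_to_map", text, pairDelim, keyValueDelim)
  }
}
```

Registration (step 2) would then be a single map entry in `QueryPlanSerde.scala` pairing `classOf[StringToMap]` with this handler.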
## Additional context
**Difficulty:** Medium
**Spark Expression Class:**
`org.apache.spark.sql.catalyst.expressions.StringToMap`
**Related:**
- **MapType**: The return type of this expression
- **CreateMap**: Expression for creating maps from explicit key-value pairs
- **MapKeys/MapValues**: Functions for extracting keys or values from maps
- **Split**: Related string splitting functionality
---
*This issue was auto-generated from Spark reference documentation.*