andygrove opened a new issue, #3162:
URL: https://github.com/apache/datafusion-comet/issues/3162
## What is the problem the feature request solves?
> **Note:** This issue was generated with AI assistance. The specification
> details have been extracted from Spark documentation and may need verification.
Comet does not currently support the Spark `get_json_object` function,
causing queries using this function to fall back to Spark's JVM execution
instead of running natively on DataFusion.
The `GetJsonObject` expression extracts a value from a JSON string using a
JSONPath expression. It takes a JSON string and a JSONPath query and returns
the match as a string: scalar matches are returned unquoted, while multiple
matches (e.g. from a wildcard) are returned as a JSON array string.
Supporting this expression would allow more Spark workloads to benefit from
Comet's native acceleration.
## Describe the potential solution
### Spark Specification
**Syntax:**
```sql
get_json_object(json_string, path)
```
```scala
// DataFrame API usage
get_json_object(col("json_column"), "$.field")
// or using expr()
expr("get_json_object(json_column, '$.field')")
```
**Arguments:**
| Argument | Type | Description |
|----------|------|-------------|
| json | Expression | The JSON string to query against |
| path | Expression | The JSONPath expression to extract values |
**Return Type:** Returns `StringType` - the extracted JSON value as a string
representation.
**Supported Data Types:**
- **json parameter**: String type containing valid JSON
- **path parameter**: String type containing valid JSONPath expressions
**Edge Cases:**
- **Null handling**: Returns null if either the json or the path argument is null
- **Invalid JSON**: Returns null for malformed JSON input strings
- **Invalid JSONPath**: Returns null for syntactically invalid JSONPath
expressions
- **No matches**: Returns null when the JSONPath matches no element
- **Multiple matches**: When the path matches several elements (e.g. via a
wildcard), returns them as a single JSON array string
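The null-propagation contract above can be sketched as a small emulation. This is an illustrative Python sketch of the semantics, not Spark's implementation, and it only handles simple `$.field.subfield` paths (no wildcards or array indexing):

```python
import json

def get_json_object(json_str, path):
    """Illustrative emulation of get_json_object's edge-case contract,
    limited to simple $.field.subfield paths."""
    if json_str is None or path is None:
        return None  # null input -> null
    if not path.startswith("$"):
        return None  # invalid JSONPath -> null
    try:
        value = json.loads(json_str)
    except ValueError:
        return None  # malformed JSON -> null
    for field in path[1:].split("."):
        if field == "":
            continue  # skip the empty token before the first dot
        if not isinstance(value, dict) or field not in value:
            return None  # path matches nothing -> null
        value = value[field]
    # Scalar strings come back unquoted; anything else as JSON text.
    return value if isinstance(value, str) else json.dumps(value)

print(get_json_object('{"a":{"b":"c"}}', "$.a.b"))  # -> c
print(get_json_object("not json", "$.a"))           # -> None
```

A native implementation would of course use a real JSONPath evaluator; the sketch only pins down which inputs must map to null.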
**Examples:**
```sql
-- Extract simple field (scalar results are returned unquoted)
SELECT get_json_object('{"name":"John","age":30}', '$.name');
-- Result: John
-- Extract from array
SELECT get_json_object('[{"a":"b"},{"a":"c"}]', '$[*].a');
-- Result: ["b","c"]
-- Extract nested field
SELECT get_json_object('{"user":{"profile":{"name":"Alice"}}}',
  '$.user.profile.name');
-- Result: Alice
-- Array element access
SELECT get_json_object('{"items":["apple","banana","cherry"]}',
  '$.items[1]');
-- Result: banana
```
```scala
// DataFrame API usage
import org.apache.spark.sql.functions._
df.select(get_json_object(col("json_data"), "$.field"))
// Using expr for complex JSONPath
df.select(expr("get_json_object(json_column, '$[*].nested.field')"))
// Multiple extractions
df.select(
  get_json_object(col("json_data"), "$.name").alias("name"),
  get_json_object(col("json_data"), "$.age").alias("age")
)
```
### Implementation Approach
See the [Comet guide on adding new
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
for detailed instructions.
1. **Scala Serde**: Add expression handler in
`spark/src/main/scala/org/apache/comet/serde/`
2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if
needed
4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has
built-in support first)
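For step 3, the new message might look roughly like the sketch below. This is a hypothetical shape only: the field names and numbering are illustrative and would need to follow the conventions already used in `expr.proto`, and if Comet can route this through an existing scalar-function message, no new message is needed at all.

```proto
// Hypothetical sketch; align with existing conventions in expr.proto.
message GetJsonObject {
  Expr child = 1;  // the JSON string column
  Expr path = 2;   // the JSONPath expression
}
```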
## Additional context
**Difficulty:** Large
**Spark Expression Class:**
`org.apache.spark.sql.catalyst.expressions.GetJsonObject`
**Related:**
- `json_tuple` - Extract multiple JSON fields in a single operation
- `from_json` - Parse JSON string into structured data types
- `to_json` - Convert structured data to JSON strings
- `json_array_length` - Get length of JSON arrays
---
*This issue was auto-generated from Spark reference documentation.*