andygrove opened a new issue, #3157:
URL: https://github.com/apache/datafusion-comet/issues/3157
## What is the problem the feature request solves?
> **Note:** This issue was generated with AI assistance. The specification details have been extracted from Spark documentation and may need verification.
Comet does not currently support the Spark `array_position` function,
causing queries using this function to fall back to Spark's JVM execution
instead of running natively on DataFusion.
The `ArrayPosition` expression finds the first occurrence of a specified
element within an array and returns its 1-based position, or 0 if the
element is not found in the array.
Supporting this expression would allow more Spark workloads to benefit from
Comet's native acceleration.
## Describe the potential solution
### Spark Specification
**Syntax:**
```sql
array_position(array, element)
```
```scala
// DataFrame API usage
import org.apache.spark.sql.functions._
df.select(array_position(col("array_column"), lit(value)))
```
**Arguments:**
| Argument | Type | Description |
|----------|------|-------------|
| array | ArrayType | The input array to search within |
| element | Any (matching array element type) | The element to find within the array |
**Return Type:** `LongType` - Returns a 1-based position index as a long
integer, or 0 if the element is not found.
**Supported Data Types:**
- Array element types must be orderable (support comparison operations)
- The search element type must be compatible with the array's element type
through type coercion (illustrated in the sketch below)
- Null types are explicitly rejected and will cause a type mismatch error
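As a small illustration of the coercion rule above, the following plain-Spark snippet (not Comet-specific; the data and column name are made up) searches an `array<bigint>` column with an `Int` literal, which Spark implicitly casts to `LongType`:

```scala
// Plain Spark illustration of element-type coercion; data and names are
// illustrative only.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder().master("local[1]").getOrCreate()
import spark.implicits._

val df = Seq(Seq(1L, 2L, 3L)).toDF("xs") // array<bigint>
// The Int literal 2 is implicitly cast to LongType to match the element type;
// the result is the 1-based position 2, returned as a LongType value.
df.select(array_position(col("xs"), lit(2))).show()
```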
**Edge Cases:**
- **Null array input**: Returns null due to `nullIntolerant = true`
- **Null search element**: Returns null due to `nullIntolerant = true`
- **Null elements in array**: Skipped during comparison, never match the
search element
- **Empty array**: Returns 0 (no elements to match)
- **Element type mismatch**: Rejected at analysis time with a detailed type
mismatch error
- **Multiple occurrences**: Only returns the position of the first occurrence
**Examples:**
```sql
-- Find position of element in array
SELECT array_position(array(312, 773, 708, 708), 414);
-- Returns: 0
SELECT array_position(array(312, 773, 708, 708), 773);
-- Returns: 2
SELECT array_position(array('a', 'b', 'c', 'b'), 'b');
-- Returns: 2 (first occurrence)
-- With null values
SELECT array_position(array(1, null, 3, null), 3);
-- Returns: 3
```
```scala
// DataFrame API examples
import org.apache.spark.sql.functions._
// Find position of specific value
df.select(array_position(col("numbers"), lit(42)))
// Find position with column reference
df.select(array_position(col("items"), col("search_value")))
// Using in filter conditions
df.filter(array_position(col("tags"), lit("important")) > 0)
```
### Implementation Approach
See the [Comet guide on adding new
expressions](https://datafusion.apache.org/comet/contributor-guide/adding_a_new_expression.html)
for detailed instructions.
1. **Scala Serde**: Add expression handler in
`spark/src/main/scala/org/apache/comet/serde/` (a rough sketch follows this list)
2. **Register**: Add to appropriate map in `QueryPlanSerde.scala`
3. **Protobuf**: Add message type in `native/proto/src/proto/expr.proto` if
needed
4. **Rust**: Implement in `native/spark-expr/src/` (check if DataFusion has
built-in support first)
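As a starting point for step 1, here is a rough, untested sketch. It is modeled on how other array expressions appear to be wired up, but every name in it (`CometExpressionSerde`, `exprToProtoInternal`, `scalarFunctionExprToProto`, `ExprOuterClass`) is an assumption to be checked against the current `QueryPlanSerde.scala` and the contributor guide, not a confirmed API. Regarding step 4, DataFusion ships an `array_position` scalar function in its nested-array function package, but its index base, not-found value, and null handling would need to be verified against the Spark semantics described above before reusing it.

```scala
// Rough sketch only: the trait, helper names, and signatures below are assumed
// from the pattern of other array-expression handlers and must be verified
// against the actual Comet code base.
import org.apache.spark.sql.catalyst.expressions.{ArrayPosition, Attribute, Expression}

object CometArrayPosition extends CometExpressionSerde {
  override def convert(
      expr: Expression,
      inputs: Seq[Attribute],
      binding: Boolean): Option[ExprOuterClass.Expr] = {
    val ap = expr.asInstanceOf[ArrayPosition]
    for {
      arrayProto <- exprToProtoInternal(ap.left, inputs, binding)
      elementProto <- exprToProtoInternal(ap.right, inputs, binding)
      // If DataFusion's built-in array_position matches Spark's semantics
      // (1-based index, 0 when not found, null propagation), mapping to a
      // scalar-function message may be enough; otherwise a dedicated
      // expression is needed on the Rust side.
      result <- scalarFunctionExprToProto("array_position", arrayProto, elementProto)
    } yield result
  }
}
```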
## Additional context
**Difficulty:** Medium
**Spark Expression Class:**
`org.apache.spark.sql.catalyst.expressions.ArrayPosition`
**Related:**
- `array_contains` - Check if array contains an element (boolean result)
- `element_at` - Get element at specific position in array
- `array_remove` - Remove all occurrences of element from array
- `size` - Get the size/length of an array
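Once implemented, an end-to-end test along the following lines could confirm both Spark-compatible answers and that the plan no longer falls back to Spark. This is a sketch that assumes the helpers used by existing array-expression tests (a suite extending `CometTestBase`, with `checkSparkAnswerAndOperator` and the test implicits in scope); the data and test name are illustrative.

```scala
// Illustrative test sketch; assumes it lives in a suite extending CometTestBase,
// where checkSparkAnswerAndOperator and the DataFrame implicits are available.
import org.apache.spark.sql.functions.expr

test("array_position") {
  val df = Seq(
    (Seq(312, 773, 708, 708), 773), // found -> 2
    (Seq(312, 773, 708, 708), 414), // not found -> 0
    (Seq.empty[Int], 1)             // empty array -> 0
  ).toDF("arr", "elem")
  // Compares Comet's native answer to Spark's and checks the operator ran in Comet.
  checkSparkAnswerAndOperator(df.select(expr("array_position(arr, elem)")))
}
```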
---
*This issue was auto-generated from Spark reference documentation.*