scovich opened a new issue, #9606: URL: https://github.com/apache/arrow-rs/issues/9606
I think Spark is just implementing [JSONPath semantics](https://www.rfc-editor.org/rfc/rfc9535#name-semantics):

> A syntactically valid segment MUST NOT produce errors when executing the query. This means that some operations that might be considered erroneous, such as using an index lying outside the range of an array, simply result in fewer nodes being selected.

Here, "syntactically valid" refers to the previous section (2.1):

> A JSONPath implementation MUST raise an error for any query that is not well-formed and valid. The well-formedness and the validity of JSONPath queries are independent of the JSON value the query is applied to. No further errors relating to the well-formedness and the validity of a JSONPath query can be raised during application of the query to a value.

This cleanly separates well-formedness/validity errors in the query itself from mismatches that may actually stem from flaws in the data.

Note: Integer overflow in an index is well-formed but not valid, so it is allowed to produce an error.

_Originally posted by @scovich in https://github.com/apache/arrow-rs/pull/8354#discussion_r2862675559_
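To make the distinction concrete, here is a minimal sketch of the two error classes under those semantics. The types and functions are hypothetical illustrations, not the arrow-rs or Spark implementation: a malformed/invalid query (e.g. an overflowing index) errors at validation time, while a valid index segment applied to data it doesn't match simply selects an empty nodelist.

```rust
// Hypothetical, simplified JSON value type for illustration only.
#[derive(Debug, Clone, PartialEq)]
enum Json {
    Number(f64),
    Array(Vec<Json>),
}

// Validation phase: a query that is not well-formed and valid MUST raise an
// error. Here we only check that the index fits in i64 -- integer overflow in
// an index is well-formed but not valid, so rejecting it is permitted.
fn parse_index(raw: &str) -> Result<i64, String> {
    raw.parse::<i64>()
        .map_err(|_| format!("invalid query: index {raw:?} out of range"))
}

// Application phase: a syntactically valid segment MUST NOT produce errors.
// An index outside the array's bounds just selects fewer (zero) nodes.
fn select_index(value: &Json, idx: i64) -> Vec<&Json> {
    match value {
        Json::Array(items) => {
            // Negative indices count from the end, per RFC 9535.
            let len = items.len() as i64;
            let i = if idx < 0 { len + idx } else { idx };
            if i >= 0 && i < len {
                vec![&items[i as usize]]
            } else {
                vec![] // out of range: empty nodelist, not an error
            }
        }
        _ => vec![], // index segment applied to a non-array selects nothing
    }
}

fn main() {
    let arr = Json::Array(vec![Json::Number(1.0), Json::Number(2.0)]);

    // Out-of-range index: empty result, no error raised.
    assert!(select_index(&arr, 5).is_empty());
    // In-range and negative indices select normally.
    assert_eq!(select_index(&arr, 0), vec![&Json::Number(1.0)]);
    assert_eq!(select_index(&arr, -1), vec![&Json::Number(2.0)]);
    // Overflowing index: rejected during validation, before application.
    assert!(parse_index("99999999999999999999999999").is_err());
    println!("ok");
}
```

Under this split, data-shape mismatches can never surface as query errors, which is the behavior the RFC text above mandates.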
