rdblue commented on code in PR #7886:
URL: https://github.com/apache/iceberg/pull/7886#discussion_r1277787358
##########
spark/v3.4/spark/src/main/java/org/apache/iceberg/spark/SparkV2Filters.java:
##########
@@ -98,6 +110,18 @@ public class SparkV2Filters {
private SparkV2Filters() {}
+  public static Expression convert(Predicate[] predicates) {
+    Expression expression = Expressions.alwaysTrue();
+    for (Predicate predicate : predicates) {
+      Expression converted = convert(predicate);
+      Preconditions.checkArgument(
+          converted != null, "Cannot convert Spark predicate to Iceberg expression: %s", predicate);
Review Comment:
Is this the correct behavior?
I think the reason we didn't have bulk conversion was so that the caller could
handle, on a per-predicate basis, any predicates that couldn't be converted. In
most cases, we want to push down as many predicates as possible. Failing the
entire conversion with an exception because a single predicate couldn't be
converted isn't a good option: we then can't push the predicates that *can* be
converted and let Spark or other engines handle the rest.
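For illustration, here is a minimal sketch of the per-predicate approach described above. The types and method names (`Predicate`, `Expression`, `convert`, `pushDown`) are simplified stand-ins, not Iceberg's or Spark's actual API: each predicate is converted independently, convertible ones are collected for pushdown, and the rest are returned to the caller so the engine can evaluate them post-scan.

```java
import java.util.ArrayList;
import java.util.List;

public class PartialPushdown {
  // Simplified stand-ins for the Spark predicate and Iceberg expression types.
  record Predicate(String sql) {}
  record Expression(String repr) {}

  // Hypothetical single-predicate converter: returns null when unsupported.
  static Expression convert(Predicate p) {
    return p.sql().startsWith("unsupported") ? null : new Expression(p.sql());
  }

  // Convert predicates one at a time: collect what converts into `pushed`,
  // and return the leftovers for the engine to evaluate itself.
  static List<Predicate> pushDown(Predicate[] predicates, List<Expression> pushed) {
    List<Predicate> postScan = new ArrayList<>();
    for (Predicate p : predicates) {
      Expression converted = convert(p);
      if (converted != null) {
        pushed.add(converted);       // convertible: push to the scan
      } else {
        postScan.add(p);             // not convertible: leave for Spark
      }
    }
    return postScan;
  }

  public static void main(String[] args) {
    List<Expression> pushed = new ArrayList<>();
    List<Predicate> leftover =
        pushDown(
            new Predicate[] {new Predicate("a = 1"), new Predicate("unsupported_udf(b)")},
            pushed);
    System.out.println(pushed.size() + " pushed, " + leftover.size() + " left for the engine");
  }
}
```

With the bulk `convert` in the diff above, the same input would instead throw on the unsupported predicate and push nothing at all.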
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]