rdblue commented on a change in pull request #749: Convert Spark In filter to iceberg IN Expression
URL: https://github.com/apache/incubator-iceberg/pull/749#discussion_r371388439
 
 

 ##########
 File path: spark/src/main/java/org/apache/iceberg/spark/SparkFilters.java
 ##########
 @@ -122,11 +122,7 @@ public static Expression convert(Filter filter) {
 
         case IN:
           In inFilter = (In) filter;
-          Expression in = alwaysFalse();
-          for (Object value : inFilter.values()) {
-            in = or(in, equal(inFilter.attribute(), convertLiteral(value)));
-          }
-          return in;
+          return in(inFilter.attribute(), inFilter.values());
 
 Review comment:
  This dropped the call to `convertLiteral` for each value. That call converts the values Spark uses into Iceberg values. I think we still need to call it:
   
   ```java
             In inFilter = (In) filter;
             List<Object> nonNullLiterals = Stream.of(inFilter.values())
                 .filter(Objects::nonNull)
                 .map(SparkFilters::convertLiteral)
                 .collect(Collectors.toList());
            boolean hasNull = Stream.of(inFilter.values()).anyMatch(Objects::isNull);
            Expression in = Expressions.in(inFilter.attribute(), nonNullLiterals);
             if (hasNull) {
               return or(isNull(inFilter.attribute()), in);
             } else {
               return in;
             }
   ```
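
  To illustrate why the conversion step matters: Spark pushes down literals in its own types (for example `java.sql.Timestamp`), while Iceberg expressions expect its internal representations. Below is a hypothetical, self-contained sketch of the null-splitting and conversion logic; `convertLiteral` here is a stand-in for the real `SparkFilters.convertLiteral`, and the timestamp-to-microseconds rule is an assumption for demonstration only:

  ```java
  import java.sql.Timestamp;
  import java.util.List;
  import java.util.Objects;
  import java.util.stream.Collectors;
  import java.util.stream.Stream;

  public class InFilterSketch {
    // Stand-in for SparkFilters.convertLiteral: converts a Spark literal into
    // the value type an Iceberg expression expects. (Assumed behavior: Spark
    // timestamps become microseconds since the epoch; other values pass through.)
    static Object convertLiteral(Object value) {
      if (value instanceof Timestamp) {
        return ((Timestamp) value).getTime() * 1000L;  // millis -> micros
      }
      return value;
    }

    public static void main(String[] args) {
      // A pushed-down IN filter may mix nulls and typed literals.
      Object[] values = {"a", null, new Timestamp(1000L)};

      // Convert only the non-null literals, exactly as the suggestion above does.
      List<Object> nonNullLiterals = Stream.of(values)
          .filter(Objects::nonNull)
          .map(InFilterSketch::convertLiteral)
          .collect(Collectors.toList());

      // Track whether a null was present so it can be handled via isNull(...).
      boolean hasNull = Stream.of(values).anyMatch(Objects::isNull);

      System.out.println(nonNullLiterals);  // [a, 1000000]
      System.out.println(hasNull);          // true
    }
  }
  ```

  Splitting the nulls out first means the IN expression only ever sees converted, non-null literals, and the null case is expressed separately as an `isNull` check OR'd with the IN.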

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
