huaxingao commented on a change in pull request #3745:
URL: https://github.com/apache/iceberg/pull/3745#discussion_r780876467



##########
File path: spark/v3.2/spark/src/main/java/org/apache/iceberg/spark/Spark3Util.java
##########
@@ -781,10 +801,29 @@ public static TableIdentifier identifierToTableIdentifier(Identifier identifier)
             Object value = CatalystTypeConverters.convertToScala(catalystValue, field.dataType());
             values.put(field.name(), String.valueOf(value));
           });
-          return new SparkPartition(values, partition.path().toString(), format);
+          FileStatus fileStatus =
+              scala.collection.JavaConverters.seqAsJavaListConverter(partition.files()).asJava().get(0);
+
+          return new SparkPartition(values, fileStatus.getPath().getParent().toString(), format);
         }).collect(Collectors.toList());
   }
 
+  private static List<org.apache.spark.sql.catalyst.expressions.Expression> getPartitionFilterExpressions(
+      SparkSession spark, String tableName, Map<String, String> partitionFilter) {
+    List<org.apache.spark.sql.catalyst.expressions.Expression> filterExpressions = Lists.newArrayList();
+    for (Map.Entry<String, String> entry : partitionFilter.entrySet()) {
+      String filter = entry.getKey() + " = '" + entry.getValue() + "'";
+      try {
+        org.apache.spark.sql.catalyst.expressions.Expression expression =
+            SparkExpressionConverter.collectResolvedSparkExpression(spark, tableName, filter);

Review comment:
       @rdblue Thank you very much for reviewing my PR on the weekend!
   
   Do you mean constructing a filter `Expression` directly instead of letting Spark generate the `Expression`? I initially generated the `Expression` like this:
   ```
        BoundReference ref = new BoundReference(index, dataType, true);
        switch (dataType.typeName()) {
          case "integer":
            filterExpressions.add(new EqualTo(ref,
                org.apache.spark.sql.catalyst.expressions.Literal.create(
                    Integer.parseInt(entry.getValue()), DataTypes.IntegerType)));
            break;
   ```
   The reviewers raised concerns about this approach because each supported data type needs its own case and its own tests. I then changed the code to call `collectResolvedSparkExpression` instead.
   
   I will address all the other comments once we settle on an approach for this one, so I can fix everything in a single commit.
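   One more note on the first hunk above: it now derives the partition directory as the parent of the first file's path rather than using `partition.path()` directly. A tiny `java.nio` analogue of that parent lookup (standard-library types in place of Hadoop's `FileStatus`/`Path`; the names are illustrative only):

```java
import java.nio.file.Path;
import java.nio.file.Paths;

// Hypothetical sketch: given the location of one data file in a partition,
// recover the partition directory by taking the path's parent -- the same
// idea as fileStatus.getPath().getParent() in the hunk above.
public class ParentPathSketch {
  static String partitionDir(String firstFilePath) {
    Path parent = Paths.get(firstFilePath).getParent();
    return parent == null ? "" : parent.toString();
  }
}
```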
   




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


