huaxingao commented on a change in pull request #3745:
URL: https://github.com/apache/iceberg/pull/3745#discussion_r774690318
##########
File path: spark/v3.2/spark/src/main/java/org/apache/iceberg/spark/Spark3Util.java
##########
@@ -781,10 +801,29 @@ public static TableIdentifier identifierToTableIdentifier(Identifier identifier)
           Object value = CatalystTypeConverters.convertToScala(catalystValue, field.dataType());
           values.put(field.name(), String.valueOf(value));
         });
-        return new SparkPartition(values, partition.path().toString(), format);
+        FileStatus fileStatus =
+            scala.collection.JavaConverters.seqAsJavaListConverter(partition.files()).asJava().get(0);
+
+        return new SparkPartition(values, fileStatus.getPath().getParent().toString(), format);
       }).collect(Collectors.toList());
   }
+  private static List<org.apache.spark.sql.catalyst.expressions.Expression> getPartitionFilterExpressions(
+      SparkSession spark, String tableName, Map<String, String> partitionFilter) {
+    List<org.apache.spark.sql.catalyst.expressions.Expression> filterExpressions = Lists.newArrayList();
+    for (Map.Entry<String, String> entry : partitionFilter.entrySet()) {
+      String filter = entry.getKey() + " = '" + entry.getValue() + "'";
Review comment:
If the filter is on a String column, e.g. a `dept` column with value `hr`,
we want the filter to be `dept = 'hr'`.
For non-String columns, such as `id = 3`, the filter `id = '3'` still works,
because Spark initially parses the value into a String `Literal` and then casts
it to an Int `Literal` once the column type is resolved.
`TestAddFilesProcedure` has both String and int partition columns, so these two
types are already covered. We should probably also test other types, e.g.
Timestamp, just to make sure.
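
For illustration, here is a minimal, self-contained sketch of the cast behavior
described above (not from the PR; the `demo` table, its data, and the local
session are made up). It shows that quoted values compare correctly against
String, Int, and Timestamp columns in plain Spark SQL:

```java
import org.apache.spark.sql.SparkSession;

public class QuotedLiteralCastDemo {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .master("local[1]")
        .appName("quoted-literal-cast-demo")
        .getOrCreate();

    // Hypothetical table with String and non-String partition-style columns.
    spark.sql("CREATE TABLE demo (dept STRING, id INT, ts TIMESTAMP) USING parquet");
    spark.sql("INSERT INTO demo VALUES "
        + "('hr', 3, timestamp'2021-12-01 00:00:00'), "
        + "('it', 4, timestamp'2021-12-02 00:00:00')");

    // String column: the quoted value is the natural form.
    spark.sql("SELECT * FROM demo WHERE dept = 'hr'").show();

    // Int column: '3' starts out as a String Literal; the analyzer casts it
    // to INT once the column type is resolved, so this matches id = 3.
    spark.sql("SELECT * FROM demo WHERE id = '3'").show();

    // Timestamp column: same mechanism, the String Literal is cast to
    // TIMESTAMP; this is the extra case suggested for testing above.
    spark.sql("SELECT * FROM demo WHERE ts = '2021-12-01 00:00:00'").show();

    spark.stop();
  }
}
```

Assuming `getPartitionFilterExpressions` ultimately hands these filter strings
to Spark's SQL parser, the same coercion should apply to the parsed `Literal`s,
which is why quoting every value appears safe for the types covered so far.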
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]