huaxingao commented on a change in pull request #3745:
URL: https://github.com/apache/iceberg/pull/3745#discussion_r783504448



##########
File path: spark/v3.2/spark/src/main/java/org/apache/iceberg/spark/Spark3Util.java
##########
@@ -781,7 +800,11 @@ public static TableIdentifier identifierToTableIdentifier(Identifier identifier)
             Object value = CatalystTypeConverters.convertToScala(catalystValue, field.dataType());
             values.put(field.name(), String.valueOf(value));
           });
-          return new SparkPartition(values, partition.path().toString(), format);
+
+          FileStatus fileStatus =
+              JavaConverters.seqAsJavaListConverter(partition.files()).asJava().get(0);

Review comment:
       Because here `partition` is a `PartitionDirectory`, which carries the partition values and a `Seq` of file statuses rather than a path, the first `FileStatus` has to be pulled out of `files()` (a short sketch follows below):
    ```
    case class PartitionDirectory(values: InternalRow, files: Seq[FileStatus])
    ```
    `listFiles` returns a `Seq` of `PartitionDirectory`:
    ```
      def listFiles(
          partitionFilters: Seq[Expression], dataFilters: Seq[Expression]): Seq[PartitionDirectory]
    ```

    Before my change, `partition` was a `PartitionPath`, which exposes the path directly:
    ```
    case class PartitionPath(values: InternalRow, path: Path)
    ```
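
    For illustration, a minimal sketch (not the exact code in this PR) of how the partition path might be recovered once only a `PartitionDirectory` is available. The helper class `PartitionPathSketch` and method `partitionPath` are hypothetical names, and the sketch assumes the partition has at least one file and that all of its files sit directly under the partition directory:
    ```java
    import java.util.List;

    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.Path;
    import org.apache.spark.sql.execution.datasources.PartitionDirectory;

    import scala.collection.JavaConverters;

    class PartitionPathSketch {
      // Hypothetical helper: PartitionDirectory has no path() accessor, so convert the
      // Scala Seq[FileStatus] to a Java List and derive the partition path from a file.
      static Path partitionPath(PartitionDirectory partition) {
        List<FileStatus> files =
            JavaConverters.seqAsJavaListConverter(partition.files()).asJava();
        // Assumes at least one file; its parent directory is taken as the partition path.
        FileStatus first = files.get(0);
        return first.getPath().getParent();
      }
    }
    ```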

##########
File path: spark/v3.2/spark/src/main/java/org/apache/iceberg/spark/SparkUtil.java
##########
@@ -179,4 +191,63 @@ public static Configuration hadoopConfCatalogOverrides(SparkSession spark, Strin
   private static String hadoopConfPrefixForCatalog(String catalogName) {
    return String.format(SPARK_CATALOG_HADOOP_CONF_OVERRIDE_FMT_STR, catalogName);
   }
+
+  /**
+   * Get a List of Spark filter Expression.
+   *
+   * @param schema table schema
+   * @param filters filters in the format of a Map, where key is one of the table column name,
+   *                and value is the specific value to be filtered on the column.
+   * @return a List of filters in the format of Spark Expression.
+   */
+  public static List getSparkFilterExpressions(StructType schema,

Review comment:
       Fixed. Thanks!
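
    For context, a hedged sketch of the kind of conversion the Javadoc above describes (a Map of column name to value turned into a List of Spark filter Expressions). This is not the PR's `getSparkFilterExpressions` implementation; the class `FilterExpressionSketch`, the method name, and the equality-only handling are assumptions made for illustration:
    ```java
    import java.util.ArrayList;
    import java.util.List;
    import java.util.Map;

    import org.apache.spark.sql.Column;
    import org.apache.spark.sql.catalyst.expressions.Expression;
    import org.apache.spark.sql.functions;
    import org.apache.spark.sql.types.StructField;
    import org.apache.spark.sql.types.StructType;

    class FilterExpressionSketch {
      // Hypothetical: build one equality predicate per (column, value) entry,
      // casting the string value to the column's declared type from the schema.
      static List<Expression> toFilterExpressions(StructType schema, Map<String, String> filters) {
        List<Expression> expressions = new ArrayList<>();
        for (Map.Entry<String, String> entry : filters.entrySet()) {
          StructField field = schema.apply(entry.getKey());
          Column predicate = functions.col(field.name())
              .equalTo(functions.lit(entry.getValue()).cast(field.dataType()));
          // Column wraps a catalyst Expression; unwrap it for the result list.
          expressions.add(predicate.expr());
        }
        return expressions;
      }
    }
    ```
    Building the predicates through the public `Column` API and unwrapping with `expr()` avoids constructing catalyst `EqualTo`/`Literal` nodes by hand; the actual code in the PR may well do the latter.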





-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


