rdblue commented on a change in pull request #3745:
URL: https://github.com/apache/iceberg/pull/3745#discussion_r780825369



##########
File path: 
spark/v3.2/spark/src/main/java/org/apache/iceberg/spark/Spark3Util.java
##########
@@ -781,10 +801,29 @@ public static TableIdentifier identifierToTableIdentifier(Identifier identifier)
             Object value = CatalystTypeConverters.convertToScala(catalystValue, field.dataType());
             values.put(field.name(), String.valueOf(value));
           });
-          return new SparkPartition(values, partition.path().toString(), format);
+          FileStatus fileStatus =
+              scala.collection.JavaConverters.seqAsJavaListConverter(partition.files()).asJava().get(0);
+
+          return new SparkPartition(values, fileStatus.getPath().getParent().toString(), format);
         }).collect(Collectors.toList());
   }
 
+  private static List<org.apache.spark.sql.catalyst.expressions.Expression> getPartitionFilterExpressions(
+      SparkSession spark, String tableName, Map<String, String> partitionFilter) {
+    List<org.apache.spark.sql.catalyst.expressions.Expression> filterExpressions = Lists.newArrayList();
+    for (Map.Entry<String, String> entry : partitionFilter.entrySet()) {
+      String filter = entry.getKey() + " = '" + entry.getValue() + "'";
+      try {
+        org.apache.spark.sql.catalyst.expressions.Expression expression =
+            SparkExpressionConverter.collectResolvedSparkExpression(spark, tableName, filter);
+        filterExpressions.add(expression);
+      } catch (AnalysisException e) {
+        throw new IllegalArgumentException("filter " + filter + " cannot be converted to Spark expression");

Review comment:
       Minor: The exception message should follow the conventions for error messages:
   * Use sentence case. That is, capitalize the first word of the message.
   * State what went wrong first, "Cannot convert filter to Spark"
   * Next, give context after a `:`, which in this case is the filter
   * Never swallow cause exceptions
   
   This should be "throw new IllegalArgumentException("Cannot convert filter to 
Spark: " + filter, e)`





-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
