kbendick commented on a change in pull request #3545:
URL: https://github.com/apache/iceberg/pull/3545#discussion_r752898624



##########
File path: spark/v3.2/spark/src/main/java/org/apache/iceberg/spark/Spark3Util.java
##########
@@ -301,7 +302,9 @@ public static Term toIcebergTerm(Expression expr) {
       Transform transform = (Transform) expr;
       Preconditions.checkArgument(transform.references().length == 1,
          "Cannot convert transform with more than one column reference: %s", transform);
-      String colName = DOT.join(transform.references()[0].fieldNames());
+      List<String> fieldNames = Arrays.stream(transform.references()[0].fieldNames())
+              .map(Spark3Util::toIcebergField).collect(Collectors.toList());

Review comment:
       Nit: This is over-indented. The dot in `.map` should line up with the first double quote on the new line 304 (so four spaces indented past `List`).

##########
File path: spark/v3.2/spark/src/main/java/org/apache/iceberg/spark/Spark3Util.java
##########
@@ -448,6 +451,19 @@ public static boolean extensionsEnabled(SparkSession spark) {
     return extensions.contains("IcebergSparkSessionExtensions");
   }
 
+  public static String toIcebergField(String name) {
+    Preconditions.checkArgument(!name.isEmpty(), "Invalid column name: (empty)");
+    if (!isCaseSensitiveEnabled()) {
+      return name.toLowerCase(Locale.ROOT);
+    }
+    return name;
+  }
+
+  public static boolean isCaseSensitiveEnabled() {
+    return Boolean.parseBoolean(
+            SparkSession.active().conf().get("spark.sql.caseSensitive", "false"));

Review comment:
       Nit: Same note about over-indentation. It should be aligned with the second `r` in `return`.
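For illustration only, here is a self-contained sketch of the normalization this diff adds, with Spark stripped out. `FieldNameSketch`, its `CASE_SENSITIVE` constant, and the sample field names are hypothetical stand-ins for the `spark.sql.caseSensitive` session conf and the `Transform` references in the real code; the continuation lines use the four-space indent suggested in the review.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Locale;
import java.util.stream.Collectors;

public class FieldNameSketch {
  // Stand-in for reading "spark.sql.caseSensitive" from the active session.
  static final boolean CASE_SENSITIVE = false;

  // Mirrors the toIcebergField in the diff: reject empty names, and
  // lower-case the name when case sensitivity is disabled.
  static String toIcebergField(String name) {
    if (name.isEmpty()) {
      throw new IllegalArgumentException("Invalid column name: (empty)");
    }
    return CASE_SENSITIVE ? name : name.toLowerCase(Locale.ROOT);
  }

  public static void main(String[] args) {
    // Same shape as the new code in toIcebergTerm, but over hard-coded
    // sample names instead of transform.references()[0].fieldNames().
    List<String> fieldNames = Arrays.stream(new String[] {"CATEGORY", "Id"})
        .map(FieldNameSketch::toIcebergField)
        .collect(Collectors.toList());
    System.out.println(fieldNames); // prints [category, id]
  }
}
```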

##########
File path: spark/v3.2/spark-extensions/src/test/java/org/apache/iceberg/spark/extensions/TestSetWriteDistributionAndOrdering.java
##########
@@ -279,4 +281,40 @@ public void testSetWriteDistributedAndLocallyOrderedInverted() {
         .build();
     Assert.assertEquals("Sort order must match", expected, table.sortOrder());
   }
+
+  @Test
+  public void testSetWriteOrderByColumnWithCaseInsensitive() {
+    sql("CREATE TABLE %s (id bigint NOT NULL, category string, ts timestamp, data string) USING iceberg", tableName);
+    Table table = validationCatalog.loadTable(tableIdent);
+    Assert.assertTrue("Table should start unsorted", table.sortOrder().isUnsorted());
+
+    spark.conf().set("spark.sql.caseSensitive", "false");
+    sql("ALTER TABLE %s WRITE ORDERED BY CATEGORY, Id", tableName);
+
+    table.refresh();
+
+    String distributionMode = table.properties().get(TableProperties.WRITE_DISTRIBUTION_MODE);
+    Assert.assertEquals("Distribution mode must match", "range", distributionMode);
+
+    SortOrder expected = SortOrder.builderFor(table.schema())
+            .withOrderId(1)
+            .asc("category", NullOrder.NULLS_FIRST)
+            .asc("id", NullOrder.NULLS_FIRST)
+            .build();
+    Assert.assertEquals("Should have expected order", expected, table.sortOrder());
+  }
+
+  @Test
+  public void testSetWriteOrderByColumnWithCaseSensitive() {
+    sql("CREATE TABLE %s (id bigint NOT NULL, category string, ts timestamp, data string) USING iceberg", tableName);
+    Table table = validationCatalog.loadTable(tableIdent);
+    Assert.assertTrue("Table should start unsorted", table.sortOrder().isUnsorted());
+
+    spark.conf().set("spark.sql.caseSensitive", "true");
+    AssertHelpers.assertThrows("Should reject invalid `write ordered` columns",
+        ValidationException.class, "Cannot find field 'CATEGORY' in struct",
+        () -> {

Review comment:
       Nit: I believe this lambda could fit on one line, and there's no need for the brackets.
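As a standalone illustration of the suggested style, here is a minimal sketch. The `assertThrows` below is a hypothetical stand-in (not the real `AssertHelpers.assertThrows` signature); the point is only that a single-expression lambda can be passed inline without braces.

```java
public class LambdaStyleSketch {
  // Minimal stand-in for an assertThrows helper: run the action and
  // verify it throws the expected exception type.
  static void assertThrows(Class<? extends RuntimeException> expected, Runnable action) {
    try {
      action.run();
      throw new AssertionError("Expected " + expected.getSimpleName() + " but nothing was thrown");
    } catch (RuntimeException e) {
      if (!expected.isInstance(e)) {
        throw e;
      }
    }
  }

  // Stand-in for the sql(...) call that should fail validation.
  static void rejectColumn() {
    throw new IllegalStateException("Cannot find field");
  }

  public static void main(String[] args) {
    // Braced form, as written in the diff:
    assertThrows(IllegalStateException.class, () -> {
      rejectColumn();
    });
    // The reviewer's suggested single-expression form, no braces needed:
    assertThrows(IllegalStateException.class, () -> rejectColumn());
    System.out.println("ok"); // prints ok
  }
}
```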




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


