rdblue commented on a change in pull request #3430:
URL: https://github.com/apache/iceberg/pull/3430#discussion_r740337346



##########
File path: 
spark/v3.0/spark-extensions/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentAlignmentSupport.scala
##########
@@ -100,7 +99,7 @@ trait AssignmentAlignmentSupport {
             case StructType(fields) =>
               // build field expressions
              val fieldExprs = fields.zipWithIndex.map { case (field, ordinal) =>
-                createAlias(GetStructField(col, ordinal, Some(field.name)), field.name)
+                Alias(GetStructField(col, ordinal, Some(field.name)), field.name)()

Review comment:
       I think that we should leave the reflection in 3.0. We aren't renaming the `iceberg-spark3-runtime` Jar, so it is possible that people currently using it for 3.1 will continue to do so. As long as we can maintain that compatibility by doing nothing, I think we should. I don't think much else is necessary, but I don't see a reason to remove the compatibility.
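For context, the reflection being discussed is the general pattern of constructing an object through `java.lang.reflect` so that one compiled artifact can run against multiple library versions whose constructor signatures differ (here, the same runtime Jar targeting both Spark 3.0 and 3.1). A minimal, self-contained sketch of that pattern — class and helper names below are illustrative, not the actual Iceberg code:

```java
import java.lang.reflect.Constructor;

// Illustrative sketch only: one compiled artifact working against multiple
// library versions whose constructors differ. Names here are hypothetical.
public class ReflectiveCompat {

  // Try each public constructor with the right arity until one accepts the
  // arguments, so the caller never compiles against a specific signature.
  static Object newInstance(String className, Object... args) throws Exception {
    Class<?> clazz = Class.forName(className);
    for (Constructor<?> ctor : clazz.getConstructors()) {
      if (ctor.getParameterCount() == args.length) {
        try {
          return ctor.newInstance(args);
        } catch (IllegalArgumentException e) {
          // argument types did not match this overload; keep looking
        }
      }
    }
    throw new NoSuchMethodException("no matching constructor on " + className);
  }

  public static void main(String[] args) throws Exception {
    // StringBuilder stands in for a class whose constructor we cannot
    // reference directly at compile time.
    Object sb = newInstance("java.lang.StringBuilder", "hello");
    System.out.println(sb);  // prints "hello"
  }
}
```

The trade-off is the usual one for reflective shims: a small runtime cost and no compile-time signature checking, in exchange for a single Jar that keeps working when the underlying API changes shape between versions.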

##########
File path: 
spark/v3.1/spark/src/test/java/org/apache/iceberg/spark/sql/TestDeleteFrom.java
##########
@@ -41,70 +38,6 @@ public void removeTables() {
     sql("DROP TABLE IF EXISTS %s", tableName);
   }
 
-  @Test
-  public void testDeleteFromUnpartitionedTable() {
-    // This test fails in Spark 3.1. `canDeleteWhere` was added to `SupportsDelete` in Spark 3.1,
-    // but logic to rewrite the query if `canDeleteWhere` returns false was left to be implemented later.
-    Assume.assumeTrue(Spark3VersionUtil.isSpark30());

Review comment:
       I think Anton added new tests in place of these in 3.2. Can you check 
whether they should be added here?
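As background for the removed test: in the Spark 3.1 DataSourceV2 API, `SupportsDelete.canDeleteWhere` reports whether a DELETE can be handled by the source (e.g. as a metadata-only operation), and the planner is expected to fall back to rewriting the query when it returns false — the piece the test comment says was left unimplemented. A simplified mock of that decision, using stand-in types rather than the real Spark interfaces:

```java
// Mock stand-in for the SupportsDelete contract; not the real Spark API.
interface DeletableTable {
  boolean canDeleteWhere(String filter);  // can the source handle this delete?
  String deleteWhere(String filter);      // source-side (metadata-only) delete
}

public class DeletePlanner {
  // Choose between a pushed-down delete and rewriting the matching rows.
  static String planDelete(DeletableTable table, String filter) {
    if (table.canDeleteWhere(filter)) {
      return table.deleteWhere(filter);     // pushed-down path
    }
    return "rewrite-plan(" + filter + ")";  // fall back to a query rewrite
  }

  public static void main(String[] args) {
    // A table that only supports pushed-down deletes on partition predicates.
    DeletableTable partitioned = new DeletableTable() {
      public boolean canDeleteWhere(String f) { return f.startsWith("part ="); }
      public String deleteWhere(String f) { return "metadata-delete(" + f + ")"; }
    };
    System.out.println(planDelete(partitioned, "part = 1"));  // pushed down
    System.out.println(planDelete(partitioned, "id = 5"));    // needs rewrite
  }
}
```

This is why the deleted test could only pass on Spark 3.0: without the rewrite fallback, a `canDeleteWhere` of false left the DELETE with no execution path.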

##########
File path: 
spark/v3.0/spark-extensions/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentAlignmentSupport.scala
##########
@@ -100,7 +99,7 @@ trait AssignmentAlignmentSupport {
            case StructType(fields) =>
              // build field expressions
              val fieldExprs = fields.zipWithIndex.map { case (field, ordinal) =>
-                createAlias(GetStructField(col, ordinal, Some(field.name)), field.name)
+                Alias(GetStructField(col, ordinal, Some(field.name)), field.name)()

Review comment:
       Thanks!

##########
File path: spark/v3.0/build.gradle
##########
@@ -129,7 +129,6 @@ project(":iceberg-spark:iceberg-spark3-extensions") {
     compileOnly project(':iceberg-data')
     compileOnly project(':iceberg-orc')
     compileOnly project(':iceberg-common')
-    compileOnly project(':iceberg-spark')

Review comment:
       Sounds good to me.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


