wypoon commented on a change in pull request #3430:
URL: https://github.com/apache/iceberg/pull/3430#discussion_r740307049
##########
File path: spark/v3.0/build.gradle
##########
@@ -129,7 +129,6 @@ project(":iceberg-spark:iceberg-spark3-extensions") {
compileOnly project(':iceberg-data')
compileOnly project(':iceberg-orc')
compileOnly project(':iceberg-common')
- compileOnly project(':iceberg-spark')
Review comment:
This and the ones below are not necessary. I removed them so they are now
consistent with the build.gradle for v3.1 and v3.2.
##########
File path: spark/v3.0/spark-extensions/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentAlignmentSupport.scala
##########
@@ -100,7 +99,7 @@ trait AssignmentAlignmentSupport {
case StructType(fields) =>
// build field expressions
val fieldExprs = fields.zipWithIndex.map { case (field, ordinal) =>
- createAlias(GetStructField(col, ordinal, Some(field.name)), field.name)
+ Alias(GetStructField(col, ordinal, Some(field.name)), field.name)()
Review comment:
Ok, I reverted the code changes for spark/v3.0. I kept the change to
spark/v3.0/build.gradle as that is just cleanup.
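For anyone following along, here is a minimal, self-contained Scala sketch of
the direct `Alias(...)()` construction shown in the hunk above. It assumes
spark-catalyst on the classpath, and the struct shape and the `col` attribute
are made up for illustration; the `createAlias` helper referenced in the
removed line is not reproduced here. The empty trailing argument list supplies
Alias's curried parameters, leaving exprId, qualifier, and metadata at their
defaults:

import org.apache.spark.sql.catalyst.expressions.{Alias, AttributeReference, GetStructField}
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

object AliasConstructionSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical struct-typed attribute standing in for `col` in the trait.
    val structType = StructType(Seq(
      StructField("id", IntegerType),
      StructField("data", StringType)))
    val col = AttributeReference("c", structType)()

    // One aliased GetStructField per nested field, as in the hunk above.
    val fieldExprs = structType.fields.zipWithIndex.map { case (field, ordinal) =>
      Alias(GetStructField(col, ordinal, Some(field.name)), field.name)()
    }
    // Prints the SQL form of each aliased nested-field expression.
    fieldExprs.foreach(e => println(e.sql))
  }
}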
##########
File path: spark/v3.1/spark/src/test/java/org/apache/iceberg/spark/sql/TestDeleteFrom.java
##########
@@ -41,70 +38,6 @@ public void removeTables() {
sql("DROP TABLE IF EXISTS %s", tableName);
}
- @Test
- public void testDeleteFromUnpartitionedTable() {
- // This test fails in Spark 3.1. `canDeleteWhere` was added to `SupportsDelete` in Spark 3.1,
- // but logic to rewrite the query if `canDeleteWhere` returns false was left to be implemented
- // later.
- Assume.assumeTrue(Spark3VersionUtil.isSpark30());
Review comment:
I tried adding the new tests from v3.2 to the v3.1 `TestDeleteFrom`, but the
new tests fail. I'll let @aokolnychyi add correct 3.1 versions, if applicable,
in a separate change.
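To make the `canDeleteWhere` point concrete, here is a hypothetical Scala
sketch of a table implementing Spark 3.1's `SupportsDelete` (the class name,
schema, and filter handling are invented for illustration, and it assumes the
Spark 3.1 DataSource V2 API on the classpath):

import java.util

import org.apache.spark.sql.connector.catalog.{SupportsDelete, TableCapability}
import org.apache.spark.sql.sources.{EqualTo, Filter}
import org.apache.spark.sql.types.StructType

// Hypothetical table: only filters on the partition column are deletable.
class PartitionOnlyDeleteTable extends SupportsDelete {
  override def name(): String = "example_table"

  override def schema(): StructType =
    new StructType().add("part", "int").add("id", "long").add("data", "string")

  override def capabilities(): util.Set[TableCapability] =
    util.EnumSet.of(TableCapability.BATCH_READ)

  // Added to SupportsDelete in Spark 3.1: lets the source veto a
  // metadata-only delete. As the removed test comment notes, in 3.1 there is
  // no logic yet to rewrite the query when this returns false, so the DELETE
  // fails instead.
  override def canDeleteWhere(filters: Array[Filter]): Boolean =
    filters.forall {
      case EqualTo("part", _) => true
      case _                  => false
    }

  override def deleteWhere(filters: Array[Filter]): Unit = {
    // Drop the matching partitions / data files here.
  }
}

Again, this is only a sketch of the contract the removed comment describes,
not the Iceberg Spark table implementation.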