wypoon commented on pull request #2512:
URL: https://github.com/apache/iceberg/pull/2512#issuecomment-838946848


   @RussellSpitzer I tried out your build.gradle. Unfortunately, it appears 
that an Iceberg Spark3 runtime jar built against Spark 3.0 does not work 
completely with Spark 3.1. The integration test `SmokeTest` fails in 
`testGettingStarted` when running a `MERGE INTO`, due to an incompatible 
constructor signature in the Spark case class `Alias`:
   ```
   java.lang.NoSuchMethodError: 
org.apache.spark.sql.catalyst.expressions.Alias.<init>(Lorg/apache/spark/sql/catalyst/expressions/Expression;Ljava/lang/String;Lorg/apache/spark/sql/catalyst/expressions/ExprId;Lscala/collection/Seq;Lscala/Option;)V
        at 
org.apache.spark.sql.catalyst.optimizer.RewriteMergeInto$$anonfun$apply$1.applyOrElse(RewriteMergeInto.scala:156)
   ```
   This is because the case class gained an additional constructor parameter in 3.1 compared to 3.0:
   
https://github.com/apache/spark/blob/branch-3.0/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/namedExpressions.scala#L146-L150
   
https://github.com/apache/spark/blob/branch-3.1/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/namedExpressions.scala#L149-L154
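
   For reference, the difference (paraphrased from the linked files; see the links above for the exact definitions) is a new curried constructor parameter in 3.1. Since Scala compiles all of a case class's parameter lists into a single `<init>` method, code compiled against the 3.0 signature hits the `NoSuchMethodError` above when run on 3.1:
   ```scala
   // Spark 3.0 (branch-3.0), paraphrased:
   case class Alias(child: Expression, name: String)(
       val exprId: ExprId = NamedExpression.newExprId,
       val qualifier: Seq[String] = Seq.empty,
       val explicitMetadata: Option[Metadata] = None)

   // Spark 3.1 (branch-3.1) adds a fourth curried parameter, so the
   // five-argument <init> that 3.0-compiled code expects no longer exists:
   case class Alias(child: Expression, name: String)(
       val exprId: ExprId = NamedExpression.newExprId,
       val qualifier: Seq[String] = Seq.empty,
       val explicitMetadata: Option[Metadata] = None,
       val nonInheritableMetadataKeys: Seq[String] = Seq.empty)
   ```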
   
   @rdblue I'm going to stick with my approach: introduce new modules to build 
and test spark3 and spark3-extensions against Spark 3.1, and add a new runtime 
module for Spark 3.1. We will explicitly produce two Spark3 runtime jars, one 
for 3.0 and one for 3.1, rather than selecting the Spark version with a build 
property.
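
   Concretely, the build could look something like the following sketch (module names and dependency versions here are illustrative only, not the final layout):
   ```groovy
   // Hypothetical multi-project layout:
   //   iceberg-spark3, iceberg-spark3-extensions, iceberg-spark3-runtime
   //     -> compiled, tested, and shaded against Spark 3.0.x
   //   iceberg-spark3.1, iceberg-spark3.1-extensions, iceberg-spark3.1-runtime
   //     -> the same sources compiled, tested, and shaded against Spark 3.1.x
   project(':iceberg-spark3') {
     dependencies {
       compileOnly "org.apache.spark:spark-sql_2.12:3.0.2"
     }
   }

   project(':iceberg-spark3.1') {
     dependencies {
       compileOnly "org.apache.spark:spark-sql_2.12:3.1.1"
     }
   }
   ```
   Each runtime module then shades its own variant, so users pick the jar matching their Spark minor version instead of relying on one jar working across both.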
    


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


