amogh-jahagirdar commented on code in PR #13464:
URL: https://github.com/apache/iceberg/pull/13464#discussion_r2186171096
##########
spark/v3.4/spark/src/test/java/org/apache/iceberg/spark/CatalogTestBase.java:
##########
@@ -41,19 +41,19 @@ protected static Object[][] parameters() {
SparkCatalogConfig.HADOOP.implementation(),
SparkCatalogConfig.HADOOP.properties()
},
- {
- SparkCatalogConfig.SPARK.catalogName(),
- SparkCatalogConfig.SPARK.implementation(),
- SparkCatalogConfig.SPARK.properties()
- },
{
Review Comment:
   I'm not sure yet what's going on, but when the parameters run in the
   original order, the REST catalog ends up set as the underlying catalog for
   the SparkSessionCatalog instead of the regular SparkCatalog I'd expect
   from this definition.
   There may be some classloader caching in Spark between test executions
   that's causing this, but the reason it surfaces in the new test is that in
   Spark 3.4/3.5 we want to skip the SparkSessionCatalog case, since it
   already fails for an unrelated reason. Reordering the parameterization
   makes everything work as expected; without it, the REST test fails because
   it somehow initializes SparkSessionCatalog first.
   Ultimately we'll need to figure out what's really going on here, but I
   wanted to give reviewers context for why this change was made.