amogh-jahagirdar commented on code in PR #13464:
URL: https://github.com/apache/iceberg/pull/13464#discussion_r2186171096


##########
spark/v3.4/spark/src/test/java/org/apache/iceberg/spark/CatalogTestBase.java:
##########
@@ -41,19 +41,19 @@ protected static Object[][] parameters() {
         SparkCatalogConfig.HADOOP.implementation(),
         SparkCatalogConfig.HADOOP.properties()
       },
-      {
-        SparkCatalogConfig.SPARK.catalogName(),
-        SparkCatalogConfig.SPARK.implementation(),
-        SparkCatalogConfig.SPARK.properties()
-      },
       {

Review Comment:
   I'm not sure yet what's going on, but I see that when the parameters are 
executed in this order, the REST catalog is set as the underlying catalog for 
the SparkSessionCatalog instead of the regular SparkCatalog that I'd expect 
given this definition.
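   For context, here's a minimal sketch of what the reordering looks like, 
assuming the SPARK entry is simply moved within the same `parameters()` matrix 
(its new position and the other entries aren't shown in this hunk):
   
   ```java
   protected static Object[][] parameters() {
     return new Object[][] {
       {
         // SparkSessionCatalog entry moved here; its exact new position is an
         // assumption, since the hunk only shows its removal after HADOOP
         SparkCatalogConfig.SPARK.catalogName(),
         SparkCatalogConfig.SPARK.implementation(),
         SparkCatalogConfig.SPARK.properties()
       },
       {
         SparkCatalogConfig.HADOOP.catalogName(),
         SparkCatalogConfig.HADOOP.implementation(),
         SparkCatalogConfig.HADOOP.properties()
       },
       // ... remaining catalog entries unchanged ...
     };
   }
   ```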
   
   I'm not sure if there's some weird classloader caching happening in Spark 
between the test executions that's leading to that behavior. The reason it 
surfaces in the new test is that in Spark 3.4/3.5 we can ignore the Spark 
session catalog, since it already fails but with a different message than the 
expected one (so we want to skip that case). I just reordered the 
parameterization and everything works as expected.
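   To illustrate what I mean by skipping that case, a sketch along these lines 
(assuming AssertJ assumptions and the `catalogName` field populated from the 
test parameters; the method name is hypothetical, not the PR's actual test):
   
   ```java
   import static org.assertj.core.api.Assumptions.assumeThat;
   
   import org.junit.jupiter.api.TestTemplate;
   
   @TestTemplate
   public void testNewBehavior() {
     // Spark 3.4/3.5: SparkSessionCatalog already fails here, but with a
     // different message than the one asserted below, so skip that case.
     assumeThat(catalogName).isNotEqualTo(SparkCatalogConfig.SPARK.catalogName());
     // ... assertions against the non-session catalogs ...
   }
   ```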
   
   Ultimately, I'll need to figure out what's really going on here, but I 
wanted to provide context to reviewers for why this change was made.


