gh-yzou commented on PR #1190:
URL: https://github.com/apache/polaris/pull/1190#issuecomment-2737659194

   @snazy Thanks a lot for the summary! I see your point about the release and BOM, and building all supported versions by default sounds perfectly fine to me. However, for the CI tests, I do think it makes sense for us to have a separate CI workflow for Spark, because the Spark tests may have different JVM requirements for different Spark versions. For example, the Iceberg test matrix has:
   ```
       strategy:
         matrix:
           jvm: [11, 17, 21]
           spark: ['3.4', '3.5']
           scala: ['2.12', '2.13']
           exclude:
             # Spark 3.5 is the first version not failing on Java 21 (https://issues.apache.org/jira/browse/SPARK-42369)
             # Full Java 21 support is coming in Spark 4 (https://issues.apache.org/jira/browse/SPARK-43831)
             - jvm: 21
               spark: '3.4'
   ```
   Having a separate CI workflow would make that setup much clearer, and also make the purpose of each workflow more obvious (debugging actually becomes easier when a test fails). The Nessie-style setup won't block me from doing a sub-run with a particular Scala version, but I may need a way to disable the Spark tests for the global runs. How about we do the following:
   1) Set up the project to build for all supported Scala versions by default (in a similar way to Nessie).
   2) Introduce a system property, -DskipSparkPlugin, to allow skipping the build of the Spark client (a rough sketch of how this could look in the Gradle settings follows this list).
   3) Introduce a separate CI workflow for Spark, to make the testing purpose clearer and to leave room for future Spark-plugin-specific testing extensions (a rough workflow outline is at the end of this comment).
   cc @RussellSpitzer 
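   And for 3), a rough outline of what a dedicated Spark workflow could look like, following the Iceberg matrix above; the job name, the Gradle invocation, and the exact JVM/Spark/Scala combinations are all placeholders to be decided:
   ```
   # .github/workflows/spark-client.yml -- sketch only
   name: Spark Client CI
   on:
     pull_request:
     push:
       branches: [ main ]

   jobs:
     spark-client-tests:
       runs-on: ubuntu-latest
       strategy:
         matrix:
           jvm: [11, 17, 21]
           spark: ['3.4', '3.5']
           scala: ['2.12', '2.13']
           exclude:
             # Same constraint as in the Iceberg matrix above (SPARK-42369 / SPARK-43831).
             - jvm: 21
               spark: '3.4'
       steps:
         - uses: actions/checkout@v4
         - uses: actions/setup-java@v4
           with:
             distribution: temurin
             java-version: ${{ matrix.jvm }}
         # Hypothetical Gradle module and property names, just for illustration.
         - run: ./gradlew :plugins:spark-client_${{ matrix.scala }}:test -DsparkVersion=${{ matrix.spark }}
   ```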

