rdblue commented on a change in pull request #3256:
URL: https://github.com/apache/iceberg/pull/3256#discussion_r726601852



##########
File path: settings.gradle
##########
@@ -55,21 +52,44 @@ project(':arrow').name = 'iceberg-arrow'
 project(':parquet').name = 'iceberg-parquet'
 project(':bundled-guava').name = 'iceberg-bundled-guava'
 project(':spark').name = 'iceberg-spark'
-project(':spark3').name = 'iceberg-spark3'
-project(':spark3-extensions').name = 'iceberg-spark3-extensions'
-project(':spark3-runtime').name = 'iceberg-spark3-runtime'
 project(':pig').name = 'iceberg-pig'
 project(':hive-metastore').name = 'iceberg-hive-metastore'
 project(':nessie').name = 'iceberg-nessie'
 
+String defaultSparkVersions = "2.4,3.0"
+List<String> knownSparkVersions = ["2.4", "3.0"]
+List<String> sparkVersions = (System.getProperty("sparkVersions") != null ? System.getProperty("sparkVersions") : defaultSparkVersions).split(",")
+
+if (!knownSparkVersions.containsAll(sparkVersions)) {
+  throw new GradleException("Found unsupported Spark versions: " + (sparkVersions - knownSparkVersions))
+}
+
+if (sparkVersions.contains("3.0")) {

Review comment:
       Let me think about this a bit more. We should be able to run the tests 
and integration tests against 3.1. The easy way to do that is to introduce a 
3.1 build directory; maybe it can share source with 3.0, though. I just want 
to avoid having extra ways of calling tests for just one build of Spark. 
Ideally, we'd add actions to test each version of Spark individually.

-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


