openinx commented on a change in pull request #4157:
URL: https://github.com/apache/iceberg/pull/4157#discussion_r809713598



##########
File path: flink/v1.14/build.gradle
##########
@@ -17,18 +17,11 @@
  * under the License.
  */
 
-def flinkProjects = [
-  project(':iceberg-flink:iceberg-flink-1.14'),
-  project(':iceberg-flink:iceberg-flink-runtime-1.14')
-]
-
-configure(flinkProjects) {
-  project.ext {
-    flinkVersion = '1.14.0'
-  }
-}
+String flinkVersion = '1.14.0'
+String flinkMajorVersion = '1.14'
+String scalaVersion = System.getProperty("scalaVersion") != null ? System.getProperty("scalaVersion") : System.getProperty("defaultScalaVersion")

Review comment:
       Good question, I also considered this issue before. The Scala versions currently available for Flink are `2.11` and `2.12`, and as far as I know most people choose Scala 2.11 for their production environment. The Scala versions available for Spark, however, are `2.11`, `2.12`, and `2.13` (some major versions only support 2.12).
   
   So in fact the supported Scala version ranges are quite different between Spark and Flink, and they also differ across Spark versions. See the version listings at:
   
   * https://mvnrepository.com/artifact/org.apache.spark/spark-sql
   * https://mvnrepository.com/artifact/org.apache.flink/flink-scala
   
   So even if we decouple the Scala version settings for Spark and Flink, I think a single Scala version still could not be used to build all of the Iceberg Spark runtime jars. The right approach is to package as many related artifacts as possible in one build pass, and to run a separate build for whatever the first pass cannot cover.
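   To make the "build again" idea concrete, here is a hedged sketch of what the multi-pass build could look like on the command line. It assumes the `scalaVersion` system property introduced in this diff and the `iceberg-flink-runtime-1.14` project path shown in the removed `flinkProjects` block; the exact task names are illustrative, not confirmed by this PR:

```shell
# Hypothetical two-pass build: one invocation per supported Flink Scala version.
# The -DscalaVersion flag maps to System.getProperty("scalaVersion") in build.gradle.
for scala in 2.11 2.12; do
  ./gradlew -DscalaVersion="$scala" :iceberg-flink:iceberg-flink-runtime-1.14:build
done
```

   Artifacts that share a Scala version can be grouped into the first invocation; only the remainder needs the second pass.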
   




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
