openinx commented on a change in pull request #3364:
URL: https://github.com/apache/iceberg/pull/3364#discussion_r736131731



##########
File path: flink/build.gradle
##########
@@ -28,11 +32,11 @@ project(':iceberg-flink') {
     implementation project(':iceberg-parquet')
     implementation project(':iceberg-hive-metastore')
 
-    compileOnly "org.apache.flink:flink-streaming-java_2.12"
-    compileOnly "org.apache.flink:flink-streaming-java_2.12::tests"
-    compileOnly "org.apache.flink:flink-table-api-java-bridge_2.12"
-    compileOnly "org.apache.flink:flink-table-planner-blink_2.12"
-    compileOnly "org.apache.flink:flink-table-planner_2.12"
+    compileOnly "org.apache.flink:flink-streaming-java_2.12:${flinkVersion}"
+    compileOnly "org.apache.flink:flink-streaming-java_2.12:${flinkVersion}:tests"
+    compileOnly "org.apache.flink:flink-table-api-java-bridge_2.12:${flinkVersion}"
+    compileOnly "org.apache.flink:flink-table-planner-blink_2.12:${flinkVersion}"
+    compileOnly "org.apache.flink:flink-table-planner_2.12:${flinkVersion}"

Review comment:
       > then I think we should add multiple Flink modules, for 1.12, 1.13, 
1.14, etc. I'm not sure how we want to manage those. For Spark, we are choosing 
to copy the code from one version to the next so they are independent. That 
would make sense for Flink as well. And because Flink only has 2 supported 
versions at a time, it wouldn't be that much work to maintain.
   
   I agree it's friendlier for iceberg users if we maintain the iceberg integration against multiple engine versions, but I have doubts about our current approach to accomplishing that goal. The current spark3.0.x and spark3.1.x integrations share all their code, yet we are trying to copy the integration code into a separate module for spark3.1.x. This kind of code splitting does isolate the integration differences between MAJOR.MINOR spark versions, but it's really tiresome to copy every newly introduced feature to every version. Take the [Core: Enable ORC delete writer in the v2 write path.](https://github.com/apache/iceberg/pull/3366) feature as an example: we need to enable the ORC delete writer for spark2.4, spark3.0, spark3.2, and maybe spark3.1 in the future.
   
   The iceberg integration work for flink 1.12 & flink 1.13 also shares the same code today. Since we can use some methods to shield the subtle differences between minor versions, is it really necessary to copy the same code just to build and test each minor version?
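   As a minimal sketch of the "shared code, parameterized version" alternative (the property name `flinkVersion` matches the diff above, but the default value and module layout here are illustrative assumptions, not the actual build setup):
   
   ```groovy
   // build.gradle (sketch) -- one shared module, with the Flink version
   // supplied as a Gradle project property instead of copied modules.
   // Override on the command line, e.g.:
   //   ./gradlew :iceberg-flink:test -PflinkVersion=1.12.5
   def flinkVersion = project.hasProperty('flinkVersion')
       ? project.property('flinkVersion')
       : '1.13.2'  // assumed default; pick the latest supported minor version
   
   project(':iceberg-flink') {
     dependencies {
       compileOnly "org.apache.flink:flink-streaming-java_2.12:${flinkVersion}"
       compileOnly "org.apache.flink:flink-table-planner_2.12:${flinkVersion}"
     }
   }
   ```
   
   With something like this, CI could run the same test sources once per supported minor version, while any version-specific differences stay behind small shim methods rather than duplicated modules.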



