aokolnychyi edited a comment on pull request #3256:
URL: https://github.com/apache/iceberg/pull/3256#issuecomment-940455683


   > 1. should we continue to keep the existing iceberg-spark module as a 
common module across all versions
   
   While I am generally against sharing code between Spark versions, I am
   50/50 here. We could probably have a common module for very basic things,
   but then we wouldn't compile that code against the exact Spark version we
   run with. Since the main development will be done against the most recent
   version, I'd slightly prefer duplicating the classes: that way we won't
   have to reason about how public and stable the Spark APIs used in that
   common module are.
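   
   For illustration, here is a minimal Gradle (Kotlin DSL) sketch of the
   compile-time pinning problem with such a common module; the module name
   and the pinned version are hypothetical, not something this PR defines.
   The shared code can only be compiled against one Spark version even
   though it would run with several:
   
   ```kotlin
   // build.gradle.kts of a hypothetical iceberg-spark-common module.
   dependencies {
       // Pinned to a single Spark version at compile time. At runtime the
       // module may sit on the classpath of 3.0.x, 3.1.x, or 3.2.x, so any
       // API drift between those versions surfaces only at runtime.
       compileOnly("org.apache.spark:spark-sql_2.12:3.1.2")
   }
   ```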
   
   > The definition is that build version is the actual Spark version we
   build against, and source version is the smallest version that would
   remain forward compatible until we create the next source version.
   
   I'd vote for no compatibility between minor versions, only across patch
   releases. We tried maintaining compatibility across 3.0 and 3.1 in the
   past, and it slowed down development and made life much harder. We are
   using Spark APIs that change frequently. I'd say having a module for each
   minor Spark version would be better (see the sketch below). I think there
   will be major features in each upcoming minor Spark version that we will
   want to consume ASAP.
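   
   To make that concrete, here is a minimal settings.gradle.kts sketch of
   one module per minor Spark version (module and directory names are
   hypothetical). Each module declares its own exact Spark dependency, so
   all integration code is compiled against the version it actually runs
   with:
   
   ```kotlin
   // settings.gradle.kts: one Iceberg integration module per Spark minor version.
   include(":iceberg-spark-3.1")
   project(":iceberg-spark-3.1").projectDir = file("spark/v3.1/spark")
   
   include(":iceberg-spark-3.2")
   project(":iceberg-spark-3.2").projectDir = file("spark/v3.2/spark")
   
   // Each module's build.gradle.kts then pins its own Spark version, e.g.:
   // dependencies { compileOnly("org.apache.spark:spark-sql_2.12:3.2.0") }
   ```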
   
   

