GitHub user PHILO-HE edited a discussion: Adopt a convention for upgrading 
Spark to a minor release (3rd release number)

Currently, Maven profiles are used to customize Gluten's build with selective 
code modules and dependencies for different Spark releases, and only one Spark 
minor release is picked as the officially supported version in each release 
line. For example, Spark 3.4.3 is currently the officially supported version in 
the Spark 3.4 release line. Previously, we could always upgrade Spark to its 
latest minor release, which possibly required some Gluten code changes to 
adapt. But with Iceberg/Hudi introduced into Gluten, the situation is 
different: a given Iceberg/Hudi release only officially supports certain Spark 
releases (much like Gluten itself), so a direct Spark upgrade can break the use 
of Iceberg or Hudi.
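
For context, the per-release selection looks roughly like the sketch below. 
This is a minimal, hypothetical Maven profile, not Gluten's actual pom.xml: 
the profile id, property names, module path, and the Iceberg version shown are 
all illustrative.

```xml
<!-- Hypothetical sketch: activating -Pspark-3.4 pins the Spark version
     and pulls in the Spark-3.4-specific shim module. An Iceberg/Hudi
     version property bound here must itself be compatible with the
     pinned Spark release, which is the coupling discussed above. -->
<profile>
  <id>spark-3.4</id>
  <properties>
    <spark.version>3.4.3</spark.version>
    <!-- illustrative version; must support the Spark release above -->
    <iceberg.version>1.5.0</iceberg.version>
  </properties>
  <modules>
    <module>shims/spark34</module>
  </modules>
</profile>
```

A build would then select the release line with something like 
`mvn clean package -Pspark-3.4`, which is why bumping `spark.version` alone 
can silently break the Iceberg/Hudi dependencies resolved by the same profile.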

So when upgrading Spark, we should make sure the currently bound Iceberg/Hudi 
version supports the new Spark release (this may work in practice even if 
Iceberg/Hudi doesn't officially support it), or we may have to upgrade 
Iceberg/Hudi at the same time to guarantee compatibility. For the latter 
choice, if only their next release meets our requirement, we may be blocked 
until a new Iceberg/Hudi release is out. That would break our previous 
convention in the community, i.e., moving quickly to always support the latest 
Spark minor release.

Any suggestion is welcome!

GitHub link: https://github.com/apache/incubator-gluten/discussions/7874

----
This is an automatically sent email for [email protected].
To unsubscribe, please send an email to: [email protected]
