Summary of the feedback from the mailing list, developers/users WeChat groups:
- a few users still use Spark 3.1 with recent Kyuubi versions, but only the Spark engine
- a few users use Spark 3.2 with recent Kyuubi versions; both the Spark engine and the Spark extensions are used
- most users run a recent Kyuubi version combined with one of Spark 3.3, 3.4, or 3.5
- some users stay on both an old Kyuubi version and an old Spark version

Based on the survey results, I'd like to keep my original proposal: users can still use the Kyuubi Spark SQL engine with Spark 3.1 on 1.9.0, but will get a deprecation message, while the Kyuubi Spark extensions will NOT work with Spark 3.1 on 1.9.0. (A rough, illustrative sketch of such a deprecation check is appended after the quoted thread below.)

>> We probably should avoid dropping critical kinds of stuff in the current
>> branch
>> being developed, especially in the midway. The proper window should be within
>> 1~2 week before or after the last release. If there's no rush, a version
>> ahead.

I'm not sure whether this is a good idea or an overly strict one. I would accept major changes at any time as long as they get approval from the community's core developers. And one advantage of making major changes midway through a release cycle is that they have a better chance of being verified by developers or early adopters before being delivered to end users via an official release.

Thanks,
Cheng Pan

> On Feb 26, 2024, at 17:40, Cheng Pan <pan3...@gmail.com> wrote:
>
> To be clear, I only list one option… the 1st item is for extensions, and the
> 2nd is for the engine ...
>
> Thanks,
> Cheng Pan
>
>
>> On Feb 26, 2024, at 17:33, Kent Yao <y...@apache.org> wrote:
>>
>> We probably should avoid dropping critical kinds of stuff in the current
>> branch
>> being developed, especially in the midway. The proper window should be within
>> 1~2 week before or after the last release. If there's no rush, a version
>> ahead.
>>
>> In this case, I'm +1 for opt.2
>>
>> On Mon, Feb 26, 2024 at 17:22, Kent Yao <y...@apache.org> wrote:
>>>
>>> For engines or extensions?
>>>
>>> On Mon, Feb 26, 2024 at 17:15, Cheng Pan <pan3...@gmail.com> wrote:
>>>>
>>>> Hi, Kyuubi developers,
>>>>
>>>> Spark 3.1 reached EOL[1] on 02/18/2022 with the latest patch release 3.1.3.
>>>> Spark 3.2 reached EOL[2] on 04/13/2023 with the latest patch release 3.2.4.
>>>> Spark 3.3 reached EOL[2] on 08/21/2023 with the latest patch release 3.3.4.
>>>>
>>>> Given that Spark 3.1 was the first widely adopted version of the Spark
>>>> 3.x series, we have kept support for it for a long time.
>>>>
>>>> Now, I propose deprecating support for Spark 3.1 (or maybe 3.1 and 3.2)
>>>> in Kyuubi 1.9.0, in detail:
>>>>
>>>> - Remove support for Spark 3.1 (or 3.1 and 3.2) in all extensions in 1.9.0
>>>> - Keep support for Spark 3.1 (or 3.1 and 3.2) in the Spark SQL engine in
>>>> 1.9.0, and remove it after 1.9.0 (maybe in 1.10.0 or 2.0.0)
>>>>
>>>> Any thoughts?
>>>>
>>>> [1] https://github.com/apache/spark-website/pull/425
>>>> [2] https://github.com/apache/spark-website/pull/500
>>>>
>>>> Thanks,
>>>> Cheng Pan
>>>>
>>>>
>
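
To make the proposed deprecation message concrete, here is a minimal, hypothetical sketch, not actual Kyuubi code: the object name, method name, and message wording are assumptions for illustration only. It shows how an engine could log a warning when it starts on Spark 3.1 or older.

// Hypothetical illustration only -- not taken from the Kyuubi codebase.
import org.apache.spark.sql.SparkSession
import org.slf4j.LoggerFactory

object SparkVersionDeprecationCheck {
  private val logger = LoggerFactory.getLogger(getClass)

  // Log a deprecation warning when the engine runs on Spark 3.1 or older.
  def warnIfDeprecated(spark: SparkSession): Unit = {
    // spark.version returns a string such as "3.1.3"
    val parts = spark.version.split('.')
    val (major, minor) = (parts(0).toInt, parts(1).toInt)
    if (major < 3 || (major == 3 && minor <= 1)) {
      logger.warn(
        s"Running on Spark ${spark.version}: Spark 3.1 support is deprecated " +
          "in Kyuubi 1.9.0 and will be removed in a future release.")
    }
  }
}

Under this sketch, the engine bootstrap would call SparkVersionDeprecationCheck.warnIfDeprecated(spark) once after the SparkSession is created, so users on Spark 3.1 see the notice while the engine keeps working unchanged.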