Thanks Liang-Chi for the Spark Operator work, and also the discussion here!

For the Spark Operator and the Connect Go Client, I am guessing they need to
support multiple versions of Spark? e.g. the same Spark Operator release may
support running multiple versions of Spark, and the Connect Go Client might
support multiple versions of the Spark driver as well.

What do people think about using the minimum supported Spark version as the
version name for the Spark Operator and the Connect Go Client? For example,
Spark Operator 3.5.x would support Spark 3.5 and above.
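
To make the idea concrete, here is a rough sketch in Go (purely hypothetical;
the names below are not an existing spark-connect-go API) of how a client
named after its minimum supported Spark version could gate connections on the
driver's version:

package compat

import (
    "fmt"
    "strconv"
    "strings"
)

// minSparkVersion is the hypothetical minimum Spark driver version this
// client release is named after (e.g. a "3.5.x" client supports 3.5+).
const minSparkVersion = "3.5.0"

// parseVersion splits a "major.minor.patch" string into integers; missing
// components default to zero.
func parseVersion(v string) ([3]int, error) {
    var out [3]int
    for i, p := range strings.SplitN(v, ".", 3) {
        n, err := strconv.Atoi(p)
        if err != nil {
            return out, fmt.Errorf("invalid version %q: %w", v, err)
        }
        out[i] = n
    }
    return out, nil
}

// IsSupported reports whether the driver's Spark version is at or above
// the minimum version this client claims to support.
func IsSupported(driverVersion string) (bool, error) {
    minV, err := parseVersion(minSparkVersion)
    if err != nil {
        return false, err
    }
    gotV, err := parseVersion(driverVersion)
    if err != nil {
        return false, err
    }
    for i := 0; i < 3; i++ {
        if gotV[i] != minV[i] {
            return gotV[i] > minV[i], nil
        }
    }
    return true, nil
}

With this naming scheme, IsSupported("3.5.1") and IsSupported("4.0.0") would
return true, while IsSupported("3.4.2") would not.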

Best,
Bo


On Tue, Apr 9, 2024 at 10:14 AM Dongjoon Hyun <dongj...@apache.org> wrote:

> Ya, that's simple and possible.
>
> However, it may cause confusion because it implies that a new `Spark
> K8s Operator 4.0.0` and `Spark Connect Go 4.0.0` follow the same `Semantic
> Versioning` policy as Apache Spark 4.0.0.
>
> In addition, `Versioning` is directly related to the release cadence. It's
> unlikely for us to have `Spark K8s Operator` and `Spark Connect Go`
> releases at every Apache Spark maintenance release. For example, there
> have been no recent commits in the Spark Connect Go repository.
>
> I believe the versioning and release cadence are more closely related to
> those subprojects' maturity.
>
> Dongjoon.
>
> On 2024/04/09 16:59:40 DB Tsai wrote:
> > Aligning with Spark releases is sensible, as it allows us to guarantee
> > that the Spark operator functions correctly with the new version while
> > also maintaining support for previous versions.
> >
> > DB Tsai  |  https://www.dbtsai.com/  |  PGP 42E5B25A8F7A82C1
> >
> > > On Apr 9, 2024, at 9:45 AM, Mridul Muralidharan <mri...@gmail.com> wrote:
> > >
> > >
> > > I am trying to understand if we can simply align with Spark's
> > > version for this? It makes the release and JIRA management much
> > > simpler for developers and more intuitive for users.
> > >
> > > Regards,
> > > Mridul
> > >
> > >
> > > On Tue, Apr 9, 2024 at 10:09 AM Dongjoon Hyun <dongj...@apache.org> wrote:
> > >> Hi, Liang-Chi.
> > >>
> > >> Thank you for leading Apache Spark K8s operator as a shepherd.
> > >>
> > >> I took a look at the `Apache Spark Connect Go` repo mentioned in the
> > >> thread. Sadly, there is no release at all and there has been no activity
> > >> in the last 6 months. It seems to be the first time for the Apache Spark
> > >> community to consider these sister repositories (Go and K8s Operator).
> > >>
> > >> https://github.com/apache/spark-connect-go/commits/master/
> > >>
> > >> Dongjoon.
> > >>
> > >> On 2024/04/08 17:48:18 "L. C. Hsieh" wrote:
> > >> > Hi all,
> > >> >
> > >> > We've opened the dedicated repository for the Spark Kubernetes
> > >> > Operator, and the first PR has been created.
> > >> > Thank you for the review from the community so far.
> > >> >
> > >> > There are some questions about the versioning of the Spark Operator.
> > >> >
> > >> > As we are using the Spark JIRA, we need to choose a Spark version
> > >> > when merging PRs. However, the Spark Operator is versioned
> > >> > differently than Spark. I'm wondering how we should deal with this?
> > >> >
> > >> > I'm not sure if Connect also versions differently from Spark? If so,
> > >> > maybe we can follow what Connect does.
> > >> >
> > >> > Can someone who is familiar with Connect versioning give some suggestions?
> > >> >
> > >> > Thank you.
> > >> >
> > >> > Liang-Chi
> > >> >
> > >> >
