Re: [External] Re: Versioning of Spark Operator

2024-04-11 Thread Ofir Manor
A related question - what is the expected release cadence? At least for the 
next 12-18 months?
Since this is a new subproject, I am personally hoping it would have a faster 
cadence at first, maybe once a month or once every couple of months... If so,
that would affect versioning.
Also, if it uses semantic versioning: since it is early days for the subproject, it
might have a few releases with breaking changes until its own API, defaults, and
behavior become stable, so again, having its own versioning might help.
Just my two cents,
   Ofir

From: L. C. Hsieh 
Sent: Wednesday, April 10, 2024 6:14 PM
To: Dongjoon Hyun 
Cc: dev@spark.apache.org 
Subject: [External] Re: Versioning of Spark Operator

This approach makes sense to me.

If the Spark K8s operator is aligned with Spark versions, it would use
4.0.0 now, for example.
Because these JIRA tickets are not actually targeting Spark 4.0.0, that
would cause confusion and raise more questions, e.g., when we cut a
Spark release, should we include Spark operator JIRAs in the release
notes?

So I think an independent version number for Spark K8s operator would
be a better option.

If there are no more options or comments, I will start a vote later
to create new "Versions" in Apache Spark JIRA.

Thank you all.

On Wed, Apr 10, 2024 at 12:20 AM Dongjoon Hyun  wrote:
>
> Ya, that would work.
>
> Inevitably, I looked at Apache Flink K8s Operator's JIRA and GitHub repo.
>
> It looks reasonable to me.
>
> Although they share the same JIRA, they choose different patterns per place.
>
> 1. In POM file and Maven Artifact, independent version number.
> 1.8.0
>
> 2. Tag is also based on the independent version number
> https://github.com/apache/flink-kubernetes-operator/tags
> - release-1.8.0
> - release-1.7.0
>
> 3. JIRA Fixed Version is `kubernetes-operator-` prefix.
> https://issues.apache.org/jira/browse/FLINK-34957
> > Fix Version/s: kubernetes-operator-1.9.0
>
> Maybe, we can borrow this pattern.
>
> I guess we need a vote for any further decision because we need to create new 
> `Versions` in Apache Spark JIRA.
>
> Dongjoon.
>




Re: Versioning of Spark Operator

2024-04-10 Thread L. C. Hsieh
This approach makes sense to me.

If the Spark K8s operator is aligned with Spark versions, it would use
4.0.0 now, for example.
Because these JIRA tickets are not actually targeting Spark 4.0.0, that
would cause confusion and raise more questions, e.g., when we cut a
Spark release, should we include Spark operator JIRAs in the release
notes?

So I think an independent version number for Spark K8s operator would
be a better option.

If there are no more options or comments, I will start a vote later
to create new "Versions" in Apache Spark JIRA.

Thank you all.

On Wed, Apr 10, 2024 at 12:20 AM Dongjoon Hyun  wrote:
>
> Ya, that would work.
>
> Inevitably, I looked at Apache Flink K8s Operator's JIRA and GitHub repo.
>
> It looks reasonable to me.
>
> Although they share the same JIRA, they choose different patterns per place.
>
> 1. In POM file and Maven Artifact, independent version number.
> 1.8.0
>
> 2. Tag is also based on the independent version number
> https://github.com/apache/flink-kubernetes-operator/tags
> - release-1.8.0
> - release-1.7.0
>
> 3. JIRA Fixed Version is `kubernetes-operator-` prefix.
> https://issues.apache.org/jira/browse/FLINK-34957
> > Fix Version/s: kubernetes-operator-1.9.0
>
> Maybe, we can borrow this pattern.
>
> I guess we need a vote for any further decision because we need to create new 
> `Versions` in Apache Spark JIRA.
>
> Dongjoon.
>




Re: Versioning of Spark Operator

2024-04-10 Thread bo yang
Cool, looks like we have two options here.

Option 1: Spark Operator and Connect Go Client are versioned independently of
Spark, e.g. starting with 0.1.0.
Pros: they can evolve their versions independently.
Cons: people will need an extra step to decide which version to use for Spark
Operator and Connect Go Client.

Option 2: Spark Operator and Connect Go Client versions are loosely tied to
Spark, e.g. starting with the supported Spark version.
Pros: it might be easier for new users to choose a version when using Spark
Operator and Connect Go Client.
Cons: there is uncertainty about how compatibility with Spark will evolve for
Spark Operator and Connect Go Client, which may affect this version naming.

Right now, the Connect Go Client uses Option 2, but it can change to Option 1 if
needed.


On Wed, Apr 10, 2024 at 6:19 AM Dongjoon Hyun 
wrote:

> Ya, that would work.
>
> Inevitably, I looked at Apache Flink K8s Operator's JIRA and GitHub repo.
>
> It looks reasonable to me.
>
> Although they share the same JIRA, they choose different patterns per
> place.
>
> 1. In POM file and Maven Artifact, independent version number.
> 1.8.0
>
> 2. Tag is also based on the independent version number
> https://github.com/apache/flink-kubernetes-operator/tags
> - release-1.8.0
> - release-1.7.0
>
> 3. JIRA Fixed Version is `kubernetes-operator-` prefix.
> https://issues.apache.org/jira/browse/FLINK-34957
> > Fix Version/s: kubernetes-operator-1.9.0
>
> Maybe, we can borrow this pattern.
>
> I guess we need a vote for any further decision because we need to create
> new `Versions` in Apache Spark JIRA.
>
> Dongjoon.
>
>


Re: Versioning of Spark Operator

2024-04-10 Thread Dongjoon Hyun
Ya, that would work.

Inevitably, I looked at Apache Flink K8s Operator's JIRA and GitHub repo.

It looks reasonable to me.

Although they share the same JIRA, they chose different patterns for each place.

1. In the POM file and Maven artifact, an independent version number:
1.8.0

2. Tags are also based on the independent version number:
https://github.com/apache/flink-kubernetes-operator/tags
- release-1.8.0
- release-1.7.0

3. The JIRA Fix Version uses a `kubernetes-operator-` prefix:
https://issues.apache.org/jira/browse/FLINK-34957
> Fix Version/s: kubernetes-operator-1.9.0

Maybe we can borrow this pattern.

I guess we need a vote for any further decision because we need to create
new `Versions` in Apache Spark JIRA.

Dongjoon.
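
For illustration, here is a minimal Go sketch (hypothetical names, not code from
the Flink or Spark repositories) of how a single independent operator version
would fan out into the three places above under this pattern:

package main

import "fmt"

// operatorRelease derives the three identifiers used by the Flink-style
// pattern from one independent operator version string.
type operatorRelease struct {
    Version string // independent operator version, e.g. "0.1.0"
}

// MavenVersion is the version published in the POM / build metadata.
func (r operatorRelease) MavenVersion() string { return r.Version }

// GitTag follows the `release-` tag convention.
func (r operatorRelease) GitTag() string { return "release-" + r.Version }

// JiraFixVersion follows the `kubernetes-operator-` Fix Version prefix.
func (r operatorRelease) JiraFixVersion() string {
    return "kubernetes-operator-" + r.Version
}

func main() {
    r := operatorRelease{Version: "0.1.0"}
    fmt.Println(r.MavenVersion())   // 0.1.0
    fmt.Println(r.GitTag())         // release-0.1.0
    fmt.Println(r.JiraFixVersion()) // kubernetes-operator-0.1.0
}

So an operator 0.1.0 release would map to Maven version 0.1.0, tag release-0.1.0,
and JIRA Fix Version kubernetes-operator-0.1.0 (assuming the same prefix were kept
for the Spark operator).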


Re: Versioning of Spark Operator

2024-04-10 Thread L. C. Hsieh
Yea, I guess, for example, the first release of Spark K8s Operator
would be something like 0.1.0 instead of 4.0.0.

It sounds like it would be hard to align with Spark versions because of that?


On Tue, Apr 9, 2024 at 10:15 AM Dongjoon Hyun  wrote:
>
> Ya, that's simple and possible.
>
> However, it may cause many confusions because it implies that new `Spark K8s 
> Operator 4.0.0` and `Spark Connect Go 4.0.0` follow the same `Semantic 
> Versioning` policy like Apache Spark 4.0.0.
>
> In addition, `Versioning` is directly related to the Release Cadence. It's 
> unlikely for us to have `Spark K8s Operator` and `Spark Connect Go` releases 
> at every Apache Spark maintenance release. For example, there is no commit in 
> Spark Connect Go repository.
>
> I believe the versioning and release cadence is related to those subprojects' 
> maturity more.
>
> Dongjoon.
>
> On 2024/04/09 16:59:40 DB Tsai wrote:
> >  Aligning with Spark releases is sensible, as it allows us to guarantee 
> > that the Spark operator functions correctly with the new version while also 
> > maintaining support for previous versions.
> >
> > DB Tsai  |  https://www.dbtsai.com/  |  PGP 42E5B25A8F7A82C1
> >
> > > On Apr 9, 2024, at 9:45 AM, Mridul Muralidharan  wrote:
> > >
> > >
> > >   I am trying to understand if we can simply align with Spark's version 
> > > for this ?
> > > Makes the release and jira management much more simpler for developers 
> > > and intuitive for users.
> > >
> > > Regards,
> > > Mridul
> > >
> > >
> > > On Tue, Apr 9, 2024 at 10:09 AM Dongjoon Hyun  wrote:
> > >> Hi, Liang-Chi.
> > >>
> > >> Thank you for leading Apache Spark K8s operator as a shepherd.
> > >>
> > >> I took a look at `Apache Spark Connect Go` repo mentioned in the thread. 
> > >> Sadly, there is no release at all and no activity since last 6 months. 
> > >> It seems to be the first time for Apache Spark community to consider 
> > >> these sister repositories (Go and K8s Operator).
> > >>
> > >> https://github.com/apache/spark-connect-go/commits/master/
> > >>
> > >> Dongjoon.
> > >>
> > >> On 2024/04/08 17:48:18 "L. C. Hsieh" wrote:
> > >> > Hi all,
> > >> >
> > >> > We've opened the dedicated repository of Spark Kubernetes Operator,
> > >> > and the first PR is created.
> > >> > Thank you for the review from the community so far.
> > >> >
> > >> > About the versioning of Spark Operator, there are questions.
> > >> >
> > >> > As we are using Spark JIRA, when we are going to merge PRs, we need to
> > >> > choose a Spark version. However, the Spark Operator is versioning
> > >> > differently than Spark. I'm wondering how we deal with this?
> > >> >
> > >> > Not sure if Connect also has its versioning different to Spark? If so,
> > >> > maybe we can follow how Connect does.
> > >> >
> > >> > Can someone who is familiar with Connect versioning give some 
> > >> > suggestions?
> > >> >
> > >> > Thank you.
> > >> >
> > >> > Liang-Chi
> > >> >
> > >> > 
> > >> >
> > >> >
> > >>
> > >> 
> > >>
> >
> >
>
>




Re: Versioning of Spark Operator

2024-04-09 Thread L. C. Hsieh
For Spark Operator, I think the answer is yes. My impression is that
the Spark Operator should be Spark version-agnostic. Zhou, please
correct me if I'm wrong.
I am not sure about the Spark Connect Go client, but if it is going
to talk with a Spark cluster, I guess it is still tied to the Spark
version (there is a compatibility issue).


> On 2024/04/09 21:35:45 bo yang wrote:
> > Thanks Liang-Chi for the Spark Operator work, and also the discussion here!
> >
> > For Spark Operator and Connector Go Client, I am guessing they need to
> > support multiple versions of Spark? e.g. same Spark Operator may support
> > running multiple versions of Spark, and Connector Go Client might support
> > multiple versions of Spark driver as well.
> >
> > How do people think of using the minimum supported Spark version as the
> > version name for Spark Operator and Connector Go Client? For example,
> > Spark Operator 3.5.x supports Spark 3.5 and above.
> >
> > Best,
> > Bo
> >
> >
> > On Tue, Apr 9, 2024 at 10:14 AM Dongjoon Hyun  wrote:
> >
> > > Ya, that's simple and possible.
> > >
> > > However, it may cause many confusions because it implies that new `Spark
> > > K8s Operator 4.0.0` and `Spark Connect Go 4.0.0` follow the same `Semantic
> > > Versioning` policy like Apache Spark 4.0.0.
> > >
> > > In addition, `Versioning` is directly related to the Release Cadence. It's
> > > unlikely for us to have `Spark K8s Operator` and `Spark Connect Go`
> > > releases at every Apache Spark maintenance release. For example, there is
> > > no commit in Spark Connect Go repository.
> > >
> > > I believe the versioning and release cadence is related to those
> > > subprojects' maturity more.
> > >
> > > Dongjoon.
> > >
> > > On 2024/04/09 16:59:40 DB Tsai wrote:
> > > >  Aligning with Spark releases is sensible, as it allows us to guarantee
> > > that the Spark operator functions correctly with the new version while 
> > > also
> > > maintaining support for previous versions.
> > > >
> > > > DB Tsai  |  https://www.dbtsai.com/  |  PGP 42E5B25A8F7A82C1
> > > >
> > > > > On Apr 9, 2024, at 9:45 AM, Mridul Muralidharan 
> > > wrote:
> > > > >
> > > > >
> > > > >   I am trying to understand if we can simply align with Spark's
> > > version for this ?
> > > > > Makes the release and jira management much more simpler for developers
> > > and intuitive for users.
> > > > >
> > > > > Regards,
> > > > > Mridul
> > > > >
> > > > >
> > > > > On Tue, Apr 9, 2024 at 10:09 AM Dongjoon Hyun  wrote:
> > > > >> Hi, Liang-Chi.
> > > > >>
> > > > >> Thank you for leading Apache Spark K8s operator as a shepherd.
> > > > >>
> > > > >> I took a look at `Apache Spark Connect Go` repo mentioned in the
> > > thread. Sadly, there is no release at all and no activity since last 6
> > > months. It seems to be the first time for Apache Spark community to
> > > consider these sister repositories (Go and K8s Operator).
> > > > >>
> > > > >> https://github.com/apache/spark-connect-go/commits/master/
> > > > >>
> > > > >> Dongjoon.
> > > > >>
> > > > >> On 2024/04/08 17:48:18 "L. C. Hsieh" wrote:
> > > > >> > Hi all,
> > > > >> >
> > > > >> > We've opened the dedicated repository of Spark Kubernetes Operator,
> > > > >> > and the first PR is created.
> > > > >> > Thank you for the review from the community so far.
> > > > >> >
> > > > >> > About the versioning of Spark Operator, there are questions.
> > > > >> >
> > > > >> > As we are using Spark JIRA, when we are going to merge PRs, we need
> > > to
> > > > >> > choose a Spark version. However, the Spark Operator is versioning
> > > > >> > differently than Spark. I'm wondering how we deal with this?
> > > > >> >
> > > > >> > Not sure if Connect also has its versioning different to Spark? If
> > > so,
> > > > >> > maybe we can follow how Connect does.
> > > > >> >
> > > > >> > Can someone who is familiar with Connect versioning give some
> > > suggestions?
> > > > >> >
> > > > >> > Thank you.
> > > > >> >
> > > > >> > Liang-Chi
> > > > >> >
> > > > >> >
> > > > >> >
> > > > >> >
> > > > >>
> > > > >>
> > > >
> > > >
> > >
> > >
> > >
> >
>
>




Re: Versioning of Spark Operator

2024-04-09 Thread Dongjoon Hyun
Do we already have a compatibility matrix for the Spark Connect Go client, Bo?

Specifically, I'm wondering which Spark versions the existing Apache Spark Connect Go
repository is able to support as of now.

We know that it is supposed to always be compatible, but do we have a way to
actually verify that via CI inside the Go repository?

Dongjoon.
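
As a rough illustration of what such a CI check could look like, here is a sketch
of a Go test meant to be driven by a CI matrix that starts a Spark Connect server
for each Spark version under test. The environment variable names and the
runSmokeQuery helper are hypothetical, not the Go repository's actual setup:

package compat_test

import (
    "os"
    "testing"
)

// runSmokeQuery stands in for whatever end-to-end check the client would run
// against a live Spark Connect server (open a session, run "SELECT 1", compare
// the result). Hypothetical helper, not an existing function in spark-connect-go.
func runSmokeQuery(t *testing.T, remote string) {
    t.Helper()
    _ = remote // placeholder: a real check would connect to `remote` here
}

// TestSparkConnectCompatibility runs once per matrix entry; CI would export
// SPARK_VERSION and SPARK_REMOTE before invoking `go test`.
func TestSparkConnectCompatibility(t *testing.T) {
    version := os.Getenv("SPARK_VERSION") // e.g. "3.5.1"
    remote := os.Getenv("SPARK_REMOTE")   // e.g. "sc://localhost:15002"
    if version == "" || remote == "" {
        t.Skip("SPARK_VERSION / SPARK_REMOTE not set; skipping compatibility check")
    }
    t.Logf("checking client against Spark %s at %s", version, remote)
    runSmokeQuery(t, remote)
}

The set of Spark versions exercised this way would effectively become the
published compatibility matrix.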

On 2024/04/09 21:35:45 bo yang wrote:
> Thanks Liang-Chi for the Spark Operator work, and also the discussion here!
> 
> For Spark Operator and Connector Go Client, I am guessing they need to
> support multiple versions of Spark? e.g. same Spark Operator may support
> running multiple versions of Spark, and Connector Go Client might support
> multiple versions of Spark driver as well.
> 
> How do people think of using the minimum supported Spark version as the
> version name for Spark Operator and Connector Go Client? For example,
> Spark Operator 3.5.x supports Spark 3.5 and above.
> 
> Best,
> Bo
> 
> 
> On Tue, Apr 9, 2024 at 10:14 AM Dongjoon Hyun  wrote:
> 
> > Ya, that's simple and possible.
> >
> > However, it may cause many confusions because it implies that new `Spark
> > K8s Operator 4.0.0` and `Spark Connect Go 4.0.0` follow the same `Semantic
> > Versioning` policy like Apache Spark 4.0.0.
> >
> > In addition, `Versioning` is directly related to the Release Cadence. It's
> > unlikely for us to have `Spark K8s Operator` and `Spark Connect Go`
> > releases at every Apache Spark maintenance release. For example, there is
> > no commit in Spark Connect Go repository.
> >
> > I believe the versioning and release cadence is related to those
> > subprojects' maturity more.
> >
> > Dongjoon.
> >
> > On 2024/04/09 16:59:40 DB Tsai wrote:
> > >  Aligning with Spark releases is sensible, as it allows us to guarantee
> > that the Spark operator functions correctly with the new version while also
> > maintaining support for previous versions.
> > >
> > > DB Tsai  |  https://www.dbtsai.com/  |  PGP 42E5B25A8F7A82C1
> > >
> > > > On Apr 9, 2024, at 9:45 AM, Mridul Muralidharan 
> > wrote:
> > > >
> > > >
> > > >   I am trying to understand if we can simply align with Spark's
> > version for this ?
> > > > Makes the release and jira management much more simpler for developers
> > and intuitive for users.
> > > >
> > > > Regards,
> > > > Mridul
> > > >
> > > >
> > > > On Tue, Apr 9, 2024 at 10:09 AM Dongjoon Hyun  wrote:
> > > >> Hi, Liang-Chi.
> > > >>
> > > >> Thank you for leading Apache Spark K8s operator as a shepherd.
> > > >>
> > > >> I took a look at `Apache Spark Connect Go` repo mentioned in the
> > thread. Sadly, there is no release at all and no activity since last 6
> > months. It seems to be the first time for Apache Spark community to
> > consider these sister repositories (Go and K8s Operator).
> > > >>
> > > >> https://github.com/apache/spark-connect-go/commits/master/
> > > >>
> > > >> Dongjoon.
> > > >>
> > > >> On 2024/04/08 17:48:18 "L. C. Hsieh" wrote:
> > > >> > Hi all,
> > > >> >
> > > >> > We've opened the dedicated repository of Spark Kubernetes Operator,
> > > >> > and the first PR is created.
> > > >> > Thank you for the review from the community so far.
> > > >> >
> > > >> > About the versioning of Spark Operator, there are questions.
> > > >> >
> > > >> > As we are using Spark JIRA, when we are going to merge PRs, we need
> > to
> > > >> > choose a Spark version. However, the Spark Operator is versioning
> > > >> > differently than Spark. I'm wondering how we deal with this?
> > > >> >
> > > >> > Not sure if Connect also has its versioning different to Spark? If
> > so,
> > > >> > maybe we can follow how Connect does.
> > > >> >
> > > >> > Can someone who is familiar with Connect versioning give some
> > suggestions?
> > > >> >
> > > >> > Thank you.
> > > >> >
> > > >> > Liang-Chi
> > > >> >
> > > >> >
> > > >> >
> > > >> >
> > > >>
> > > >>
> > >
> > >
> >
> >
> >
> 




Re: Versioning of Spark Operator

2024-04-09 Thread bo yang
Thanks Liang-Chi for the Spark Operator work, and also the discussion here!

For Spark Operator and the Connect Go Client, I am guessing they need to
support multiple versions of Spark? E.g. the same Spark Operator may support
running multiple versions of Spark, and the Connect Go Client might support
multiple versions of the Spark driver as well.

What do people think of using the minimum supported Spark version as the
version name for Spark Operator and the Connect Go Client? For example,
Spark Operator 3.5.x would support Spark 3.5 and above.

Best,
Bo
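
To make that naming scheme concrete, here is a minimal sketch (hypothetical code,
not from either repository) of the compatibility rule it implies: the operator's
own major.minor doubles as the minimum supported Spark version.

package main

import (
    "fmt"
    "strconv"
    "strings"
)

// majorMinor extracts the major.minor prefix of a version string,
// e.g. "3.5.2" -> (3, 5).
func majorMinor(version string) (major, minor int, err error) {
    parts := strings.Split(version, ".")
    if len(parts) < 2 {
        return 0, 0, fmt.Errorf("unexpected version %q", version)
    }
    if major, err = strconv.Atoi(parts[0]); err != nil {
        return 0, 0, err
    }
    minor, err = strconv.Atoi(parts[1])
    return major, minor, err
}

// supports reports whether a Spark version is at or above the minimum
// supported Spark version encoded in the operator's own version.
func supports(operatorVersion, sparkVersion string) bool {
    omaj, omin, err1 := majorMinor(operatorVersion)
    smaj, smin, err2 := majorMinor(sparkVersion)
    if err1 != nil || err2 != nil {
        return false
    }
    return smaj > omaj || (smaj == omaj && smin >= omin)
}

func main() {
    fmt.Println(supports("3.5.0", "3.5.1")) // true: Spark 3.5 and above
    fmt.Println(supports("3.5.0", "4.0.0")) // true
    fmt.Println(supports("3.5.0", "3.4.2")) // false: below the minimum
}

The con noted above still applies: if a future release's actual compatibility
range stops matching its own version number, this simple rule breaks down.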


On Tue, Apr 9, 2024 at 10:14 AM Dongjoon Hyun  wrote:

> Ya, that's simple and possible.
>
> However, it may cause many confusions because it implies that new `Spark
> K8s Operator 4.0.0` and `Spark Connect Go 4.0.0` follow the same `Semantic
> Versioning` policy like Apache Spark 4.0.0.
>
> In addition, `Versioning` is directly related to the Release Cadence. It's
> unlikely for us to have `Spark K8s Operator` and `Spark Connect Go`
> releases at every Apache Spark maintenance release. For example, there is
> no commit in Spark Connect Go repository.
>
> I believe the versioning and release cadence is related to those
> subprojects' maturity more.
>
> Dongjoon.
>
> On 2024/04/09 16:59:40 DB Tsai wrote:
> >  Aligning with Spark releases is sensible, as it allows us to guarantee
> that the Spark operator functions correctly with the new version while also
> maintaining support for previous versions.
> >
> > DB Tsai  |  https://www.dbtsai.com/  |  PGP 42E5B25A8F7A82C1
> >
> > > On Apr 9, 2024, at 9:45 AM, Mridul Muralidharan 
> wrote:
> > >
> > >
> > >   I am trying to understand if we can simply align with Spark's
> version for this ?
> > > Makes the release and jira management much more simpler for developers
> and intuitive for users.
> > >
> > > Regards,
> > > Mridul
> > >
> > >
> > > On Tue, Apr 9, 2024 at 10:09 AM Dongjoon Hyun  wrote:
> > >> Hi, Liang-Chi.
> > >>
> > >> Thank you for leading Apache Spark K8s operator as a shepherd.
> > >>
> > >> I took a look at `Apache Spark Connect Go` repo mentioned in the
> thread. Sadly, there is no release at all and no activity since last 6
> months. It seems to be the first time for Apache Spark community to
> consider these sister repositories (Go and K8s Operator).
> > >>
> > >> https://github.com/apache/spark-connect-go/commits/master/
> > >>
> > >> Dongjoon.
> > >>
> > >> On 2024/04/08 17:48:18 "L. C. Hsieh" wrote:
> > >> > Hi all,
> > >> >
> > >> > We've opened the dedicated repository of Spark Kubernetes Operator,
> > >> > and the first PR is created.
> > >> > Thank you for the review from the community so far.
> > >> >
> > >> > About the versioning of Spark Operator, there are questions.
> > >> >
> > >> > As we are using Spark JIRA, when we are going to merge PRs, we need
> to
> > >> > choose a Spark version. However, the Spark Operator is versioning
> > >> > differently than Spark. I'm wondering how we deal with this?
> > >> >
> > >> > Not sure if Connect also has its versioning different to Spark? If
> so,
> > >> > maybe we can follow how Connect does.
> > >> >
> > >> > Can someone who is familiar with Connect versioning give some
> suggestions?
> > >> >
> > >> > Thank you.
> > >> >
> > >> > Liang-Chi
> > >> >
> > >> >
> > >> >
> > >> >
> > >>
> > >>
> >
> >
>
>
>


Re: Versioning of Spark Operator

2024-04-09 Thread Dongjoon Hyun
Ya, that's simple and possible.

However, it may cause a lot of confusion because it implies that a new `Spark K8s
Operator 4.0.0` and `Spark Connect Go 4.0.0` follow the same `Semantic
Versioning` policy as Apache Spark 4.0.0.

In addition, `Versioning` is directly related to the release cadence. It's
unlikely for us to have `Spark K8s Operator` and `Spark Connect Go` releases at
every Apache Spark maintenance release. For example, there have been no recent
commits in the Spark Connect Go repository.

I believe the versioning and release cadence are related more to those
subprojects' maturity.

Dongjoon.

On 2024/04/09 16:59:40 DB Tsai wrote:
>  Aligning with Spark releases is sensible, as it allows us to guarantee that 
> the Spark operator functions correctly with the new version while also 
> maintaining support for previous versions.
>  
> DB Tsai  |  https://www.dbtsai.com/  |  PGP 42E5B25A8F7A82C1
> 
> > On Apr 9, 2024, at 9:45 AM, Mridul Muralidharan  wrote:
> > 
> > 
> >   I am trying to understand if we can simply align with Spark's version for 
> > this ?
> > Makes the release and jira management much more simpler for developers and 
> > intuitive for users.
> > 
> > Regards,
> > Mridul
> > 
> > 
> > On Tue, Apr 9, 2024 at 10:09 AM Dongjoon Hyun  wrote:
> >> Hi, Liang-Chi.
> >> 
> >> Thank you for leading Apache Spark K8s operator as a shepherd. 
> >> 
> >> I took a look at `Apache Spark Connect Go` repo mentioned in the thread. 
> >> Sadly, there is no release at all and no activity since last 6 months. It 
> >> seems to be the first time for Apache Spark community to consider these 
> >> sister repositories (Go and K8s Operator).
> >> 
> >> https://github.com/apache/spark-connect-go/commits/master/
> >> 
> >> Dongjoon.
> >> 
> >> On 2024/04/08 17:48:18 "L. C. Hsieh" wrote:
> >> > Hi all,
> >> > 
> >> > We've opened the dedicated repository of Spark Kubernetes Operator,
> >> > and the first PR is created.
> >> > Thank you for the review from the community so far.
> >> > 
> >> > About the versioning of Spark Operator, there are questions.
> >> > 
> >> > As we are using Spark JIRA, when we are going to merge PRs, we need to
> >> > choose a Spark version. However, the Spark Operator is versioning
> >> > differently than Spark. I'm wondering how we deal with this?
> >> > 
> >> > Not sure if Connect also has its versioning different to Spark? If so,
> >> > maybe we can follow how Connect does.
> >> > 
> >> > Can someone who is familiar with Connect versioning give some 
> >> > suggestions?
> >> > 
> >> > Thank you.
> >> > 
> >> > Liang-Chi
> >> > 
> >> > 
> >> > 
> >> > 
> >> 
> >> 
> >> 
> 
> 




Re: Versioning of Spark Operator

2024-04-09 Thread DB Tsai
 Aligning with Spark releases is sensible, as it allows us to guarantee that 
the Spark operator functions correctly with the new version while also 
maintaining support for previous versions.
 
DB Tsai  |  https://www.dbtsai.com/  |  PGP 42E5B25A8F7A82C1

> On Apr 9, 2024, at 9:45 AM, Mridul Muralidharan  wrote:
> 
> 
>   I am trying to understand if we can simply align with Spark's version for 
> this ?
> Makes the release and jira management much more simpler for developers and 
> intuitive for users.
> 
> Regards,
> Mridul
> 
> 
> On Tue, Apr 9, 2024 at 10:09 AM Dongjoon Hyun  wrote:
>> Hi, Liang-Chi.
>> 
>> Thank you for leading Apache Spark K8s operator as a shepherd. 
>> 
>> I took a look at `Apache Spark Connect Go` repo mentioned in the thread. 
>> Sadly, there is no release at all and no activity since last 6 months. It 
>> seems to be the first time for Apache Spark community to consider these 
>> sister repositories (Go and K8s Operator).
>> 
>> https://github.com/apache/spark-connect-go/commits/master/
>> 
>> Dongjoon.
>> 
>> On 2024/04/08 17:48:18 "L. C. Hsieh" wrote:
>> > Hi all,
>> > 
>> > We've opened the dedicated repository of Spark Kubernetes Operator,
>> > and the first PR is created.
>> > Thank you for the review from the community so far.
>> > 
>> > About the versioning of Spark Operator, there are questions.
>> > 
>> > As we are using Spark JIRA, when we are going to merge PRs, we need to
>> > choose a Spark version. However, the Spark Operator is versioning
>> > differently than Spark. I'm wondering how we deal with this?
>> > 
>> > Not sure if Connect also has its versioning different to Spark? If so,
>> > maybe we can follow how Connect does.
>> > 
>> > Can someone who is familiar with Connect versioning give some suggestions?
>> > 
>> > Thank you.
>> > 
>> > Liang-Chi
>> > 
>> > 
>> > 
>> > 
>> 
>> 
>> 



Re: Versioning of Spark Operator

2024-04-09 Thread Mridul Muralidharan
  I am trying to understand if we can simply align with Spark's version for
this?
It makes the release and JIRA management much simpler for developers and
more intuitive for users.

Regards,
Mridul


On Tue, Apr 9, 2024 at 10:09 AM Dongjoon Hyun  wrote:

> Hi, Liang-Chi.
>
> Thank you for leading Apache Spark K8s operator as a shepherd.
>
> I took a look at `Apache Spark Connect Go` repo mentioned in the thread.
> Sadly, there is no release at all and no activity since last 6 months. It
> seems to be the first time for Apache Spark community to consider these
> sister repositories (Go and K8s Operator).
>
> https://github.com/apache/spark-connect-go/commits/master/
>
> Dongjoon.
>
> On 2024/04/08 17:48:18 "L. C. Hsieh" wrote:
> > Hi all,
> >
> > We've opened the dedicated repository of Spark Kubernetes Operator,
> > and the first PR is created.
> > Thank you for the review from the community so far.
> >
> > About the versioning of Spark Operator, there are questions.
> >
> > As we are using Spark JIRA, when we are going to merge PRs, we need to
> > choose a Spark version. However, the Spark Operator is versioning
> > differently than Spark. I'm wondering how we deal with this?
> >
> > Not sure if Connect also has its versioning different to Spark? If so,
> > maybe we can follow how Connect does.
> >
> > Can someone who is familiar with Connect versioning give some
> suggestions?
> >
> > Thank you.
> >
> > Liang-Chi
> >
> >
> >
>
>
>


Re: Versioning of Spark Operator

2024-04-09 Thread Dongjoon Hyun
Hi, Liang-Chi.

Thank you for leading the Apache Spark K8s operator effort as a shepherd.

I took a look at the `Apache Spark Connect Go` repo mentioned in the thread. Sadly,
there is no release at all and no activity in the last 6 months. It seems to be
the first time for the Apache Spark community to consider these sister repositories
(Go and K8s Operator).

https://github.com/apache/spark-connect-go/commits/master/

Dongjoon.

On 2024/04/08 17:48:18 "L. C. Hsieh" wrote:
> Hi all,
> 
> We've opened the dedicated repository of Spark Kubernetes Operator,
> and the first PR is created.
> Thank you for the review from the community so far.
> 
> About the versioning of Spark Operator, there are questions.
> 
> As we are using Spark JIRA, when we are going to merge PRs, we need to
> choose a Spark version. However, the Spark Operator is versioning
> differently than Spark. I'm wondering how we deal with this?
> 
> Not sure if Connect also has its versioning different to Spark? If so,
> maybe we can follow how Connect does.
> 
> Can someone who is familiar with Connect versioning give some suggestions?
> 
> Thank you.
> 
> Liang-Chi
> 
> 
> 
