Date: Wednesday, April 10, 2024 6:14 PM
To: Dongjoon Hyun
Cc: dev@spark.apache.org
Subject: [External] Re: Versioning of Spark Operator
This approach makes sense to me.
If the Spark K8s operator were aligned with Spark versions, it would, for
example, use 4.0.0 now. Because these JIRA tickets are not actually
targeting Spark 4.0.0, that will cause confusion and raise more questions,
like: when we cut a Spark release, should we include Spark …
Cool, looks like we have two options here.
Option 1: Spark Operator and Connect Go Client versioning independent of
Spark, e.g. starting with 0.1.0.
Pros: they can evolve their versions independently.
Cons: people will need an extra step to decide the version when using the
Spark Operator and Connect Go Client …
Ya, that would work.
Inevitably, I looked at the Apache Flink K8s Operator's JIRA and GitHub repo.
It looks reasonable to me.
Although they share the same JIRA, they chose different patterns in each place.
1. In the POM file and Maven artifact, an independent version number: 1.8.0
2. The tag is also based on th…
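For reference, the independent-version pattern mentioned above shows up directly in the operator's Maven coordinates. A minimal, abbreviated POM fragment (not a complete POM; only the coordinate lines are shown) would look roughly like:

```xml
<!-- flink-kubernetes-operator carries its own version line,
     decoupled from the Flink release it supports -->
<groupId>org.apache.flink</groupId>
<artifactId>flink-kubernetes-operator</artifactId>
<version>1.8.0</version>
```

The point being that consumers resolve the operator artifact at 1.8.0 even while it manages Flink jobs at other versions.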
Yea, I guess, for example, the first release of Spark K8s Operator
would be something like 0.1.0 instead of 4.0.0.
Does it sound hard to align with Spark versions because of that?
On Tue, Apr 9, 2024 at 10:15 AM Dongjoon Hyun wrote:
>
> Ya, that's simple and possible.
>
> However, it may cause many …
For Spark Operator, I think the answer is yes. My impression is that
the Spark Operator should be Spark version-agnostic. Zhou, please
correct me if I'm wrong.
I am not sure about the Spark Connector Go client, but if it is going
to talk with a Spark cluster, I guess it should still be related to
S…
Do we have a compatibility matrix for the Apache Spark Connect Go client
already, Bo?
Specifically, I'm wondering which versions the existing Apache Spark Connect Go
repository is able to support as of now.
We know that it is supposed to always be compatible, but do we have a way to
verify that actually v…
Thanks Liang-Chi for the Spark Operator work, and also the discussion here!
For the Spark Operator and Connector Go Client, I am guessing they need to
support multiple versions of Spark? E.g. the same Spark Operator may support
running multiple versions of Spark, and the Connector Go Client might support
mult…
Ya, that's simple and possible.
However, it may cause a lot of confusion because it implies that the new
`Spark K8s Operator 4.0.0` and `Spark Connect Go 4.0.0` follow the same
`Semantic Versioning` policy as Apache Spark 4.0.0.
In addition, `Versioning` is directly related to the release cadence. I…
Aligning with Spark releases is sensible, as it allows us to guarantee that
the Spark operator functions correctly with the new version while also
maintaining support for previous versions.
DB Tsai | https://www.dbtsai.com/ | PGP 42E5B25A8F7A82C1
> On Apr 9, 2024, at 9:45 AM, Mridul Mural…
I am trying to understand if we can simply align with Spark's version for
this?
It makes release and JIRA management much simpler for developers and more
intuitive for users.
Regards,
Mridul
On Tue, Apr 9, 2024 at 10:09 AM Dongjoon Hyun wrote:
> Hi, Liang-Chi.
>
> Thank you for leading Apa…
Hi, Liang-Chi.
Thank you for leading the Apache Spark K8s operator as a shepherd.
I took a look at the `Apache Spark Connect Go` repo mentioned in the thread.
Sadly, there is no release at all and no activity in the last 6 months. It
seems to be the first time for the Apache Spark community to consider the…