Hello,
This explanation is splendidly detailed and warrants further study.
However, my first thought concerns the point raised below, which I
quote:
"... There is a company claiming something non-Apache like "Apache Spark
3.4.0 minus SPARK-40436" with the name "Apache Spark 3.4.0."
It goes to "legal-discuss@".
https://lists.apache.org/thread/mzhggd0rpz8t4d7vdsbhkp38mvd3lty4
I hope we can settle the legal question clearly and promptly, one way or the
other, so that we can follow the conclusion with confidence.
Dongjoon
On 2023/06/06 20:06:42 Dongjoon Hyun wrote:
> Thank you, Sean, Mich,
Hello Spark developers,
I'm from the Apache Arrow project. We've discussed Java version support [1],
and crucially, whether to continue supporting Java 8 or not. As Spark is a big
user of Arrow in Java, I was curious what Spark's policy here was.
If Spark intends to stay on Java 8, for
I haven't followed this discussion closely, but I think we could/should
drop Java 8 in Spark 4.0, which is up next after 3.5?
On Tue, Jun 6, 2023 at 2:44 PM David Li wrote:
> Hello Spark developers,
>
> I'm from the Apache Arrow project. We've discussed Java version support
> [1], and
+1 on dropping Java 8 in Spark 4.0, saying this as a fan of the fast-paced
(positive) updates to Arrow, eh?!
On Tue, Jun 6, 2023 at 4:02 PM Sean Owen wrote:
> I haven't followed this discussion closely, but I think we could/should
> drop Java 8 in Spark 4.0, which is up next after 3.5?
>
> On
+1 on dropping Java 8 in Spark 4.0, and I even hope Spark 4.0 can only support
Java 17 and the upcoming Java 21.
From: Denny Lee
Date: Wednesday, June 7, 2023, 07:10
To: Sean Owen
Cc: David Li , "dev@spark.apache.org"
Subject: Re: JDK version support policy?
+1 on dropping Java 8 in Spark 4.0, saying this as
Thank you, Sean, Mich, Holden, again.
For this specific part, let's ask the ASF board via bo...@apache.org to
find the right answer, because this is a controversial legal issue.
> I think you'd just prefer Databricks make a different choice, which is
legitimate, but, an issue to take up with
So I think it could be reasonable if the Spark PMC wants to ask Databricks
something (although I'm a little fuzzy as to the ask), but that
conversation might belong on private@ (I could be wrong, of course).
On Tue, Jun 6, 2023 at 3:29 AM Mich Talebzadeh
wrote:
> I concur with you Sean.
>
> If
Hi, All and Matei (as the Chair of Spark PMC).
For the ASF policy violation part, here is a legal recommendation
documentation (draft) from `legal-discuss@`.
https://www.apache.org/foundation/marks/downstream.html#source
> A version number must be used that both clearly differentiates it from
I'm also +1 on dropping both Java 8 and 11 in Apache Spark 4.0.
Dongjoon.
On 2023/06/07 02:42:19 yangjie01 wrote:
> +1 on dropping Java 8 in Spark 4.0, and I even hope Spark 4.0 can only
> support Java 17 and the upcoming Java 21.
>
> From: Denny Lee
> Date: Wednesday, June 7, 2023, 07:10
> To: Sean
I concur with you Sean.
If I understand the point raised by the thread owner correctly, in the
heterogeneous environments we work in, it is up to the practitioner to
ensure version compatibility among the OS version, the Spark version,
and the target artefact in question. For example if
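The compatibility concern above can be illustrated with a small, hedged sketch (a hypothetical helper, not Spark's actual check): parsing the standard `java.version` system property, which uses the legacy `1.8.x` scheme on Java 8 and the modern feature-release scheme on Java 9+.

```java
// Hypothetical helper to detect the JVM feature release a job runs on,
// e.g. to fail fast when a cluster node has an older JDK than expected.
public class JavaVersionCheck {
    // Returns the feature release number for a java.version string.
    static int featureVersion(String version) {
        if (version.startsWith("1.")) {
            // Legacy scheme used through Java 8: "1.8.0_392" -> 8
            return Integer.parseInt(version.split("\\.")[1]);
        }
        // Modern scheme (Java 9+): "17.0.9" -> 17, "21" -> 21
        return Integer.parseInt(version.split("[.\\-+]")[0]);
    }

    public static void main(String[] args) {
        int v = featureVersion(System.getProperty("java.version"));
        if (v < 8) {
            System.err.println("Unsupported Java " + v);
        } else {
            System.out.println("Running on Java " + v);
        }
    }
}
```

On Java 10+ the same information is available directly via `Runtime.version().feature()`; the string-parsing form is shown only because it also works on Java 8, which is the version under discussion.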