+1 to drop Java 8, but +1 to set the lowest supported version to Java 11.
Considering the security-updates-only phase, JDK 11 LTS would not be EOL'd
for a very long time. Unless that's coupled with other deps which require
bumping the JDK version (hope someone can bring up a list), it doesn't seem to
buy
Dongjoon,
I followed the conversation, and in my opinion, your concern is totally
legit.
It just feels that the discussion is focused solely on Databricks, and as I
said above, the same issue occurs in other vendors as well.
On Wed, Jun 7, 2023 at 10:28 PM Dongjoon Hyun wrote:
To Grisha, we are talking about what is the right way and how to comply
with ASF legal advice which I shared in this thread from "legal-discuss@"
mailing thread.
https://lists.apache.org/thread/mzhggd0rpz8t4d7vdsbhkp38mvd3lty4
(legal-discuss@)
Yes, in Spark UI you have it as "3.1.2-amazon", but when you create a
cluster it's just Spark 3.1.2.
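(As an aside, the distinction being debated here is visible purely in the
reported version string. A minimal illustrative sketch, not a Spark API,
with hypothetical helper names, of separating the upstream base version
from a vendor suffix such as "-amazon":)

```java
// Illustrative only: hypothetical helpers, not part of any Spark API.
// Splits a reported version string like "3.1.2-amazon" into the upstream
// base ("3.1.2") and the vendor suffix ("amazon").
public class VersionSuffix {

    // Everything before the first dash is treated as the upstream base.
    static String upstreamBase(String version) {
        int dash = version.indexOf('-');
        return dash < 0 ? version : version.substring(0, dash);
    }

    // Everything after the first dash is treated as the vendor suffix;
    // a plain upstream version yields an empty string.
    static String vendorSuffix(String version) {
        int dash = version.indexOf('-');
        return dash < 0 ? "" : version.substring(dash + 1);
    }

    public static void main(String[] args) {
        System.out.println(upstreamBase("3.1.2-amazon")); // prints 3.1.2
        System.out.println(vendorSuffix("3.1.2-amazon")); // prints amazon
        System.out.println(vendorSuffix("3.4.0"));        // prints an empty line
    }
}
```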
On Wed, Jun 7, 2023 at 10:05 PM Nan Zhu wrote:
for EMR, I think they show 3.1.2-amazon in Spark UI, no?
On Wed, Jun 7, 2023 at 11:30 Grisha Weintraub wrote:
Hi,
I am not taking sides here, but just for fairness, I think it should be
noted that AWS EMR does exactly the same thing.
We choose the EMR version (e.g., 6.4.0) and it has an associated Spark
version (e.g., 3.1.2).
The Spark version here is not the original Apache version but AWS Spark
OK, is this the crux of the matter?
We are not asking a big thing ...
First, who are we here? Members?
In my opinion, without being overly specific, this discussion has lost its
objectivity. However, with reference to your point, I am sure a simple
vote would clarify the position in a fairer
I also generally perceive that, after Java 9, there are far fewer breaking
changes. So working on Java 11 probably means it works on 20, or can easily
be made to without pain. I think the tweaks for Java 17 were quite small,
for example.
Targeting Java >11 excludes Java 11 users and probably wouldn't buy
So JDK 11 is still supported in OpenJDK until 2026, and I'm not sure we're
going to see enough folks moving to JRE 17 by the Spark 4 release. Unless we
have a strong benefit from dropping 11 support, I'd be inclined to keep it.
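(For context on what "keeping 11 support" means mechanically, a minimal
sketch, using only the plain JDK `Runtime.version().feature()` API available
since Java 10, of gating at startup on the running JVM's feature version;
the class name and message are hypothetical:)

```java
// Sketch: fail fast if the running JVM is older than a required floor.
// Runtime.version().feature() returns the major feature release number,
// e.g. 11, 17, or 21 (standard JDK API since Java 10).
public class JdkCheck {
    static final int REQUIRED_FEATURE = 11; // hypothetical support floor

    public static void main(String[] args) {
        int feature = Runtime.version().feature();
        if (feature < REQUIRED_FEATURE) {
            throw new IllegalStateException(
                "Java " + REQUIRED_FEATURE + " or newer required, found " + feature);
        }
        System.out.println("Running on Java " + feature);
    }
}
```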
On Tue, Jun 6, 2023 at 9:08 PM Dongjoon Hyun wrote:
> I'm also +1 on
I disagree with you in several ways.
The following is not a *minor* change like the given examples (alterations
to the start-up and shutdown scripts, configuration files, file layout,
etc.).
> The change you cite meets the 4th point, minor change, made for
integration reasons.
The following is
Hi Dongjoon, I think this conversation is not advancing anymore. I
personally consider the matter closed unless you can find other support or
respond with more specifics. While this perhaps should be on private@, I
think it's not wrong as an instructive discussion on dev@.
I don't believe you've
Sean, it seems that you are confused here. We are not talking about your upper
system (the notebook environment). We are talking about the submodule, "Apache
Spark 3.4.0-databricks". Whatever you call it, both of us know "Apache Spark
3.4.0-databricks" is different from "Apache Spark 3.4.0".
(With consent, shall we move this to the PMC list?)
No, I don't think that's what this policy says.
First, could you please be more specific here? Why do you think a certain
release is at odds with this?
Because, so far, the only thing you've mentioned is, I think, not taking a
Scala maintenance release update.