+1 on removing Scala 2.11 support for 3.0 given Scala 2.11 is already EOL.
On Tue, Nov 20, 2018 at 2:53 PM Sean Owen wrote:
PS: pull request at https://github.com/apache/spark/pull/23098
Not going to merge it until there's clear agreement.
On Tue, Nov 20, 2018 at 10:16 AM Ryan Blue wrote:
+1 to removing 2.11 support for 3.0 and a PR.
It sounds like having multiple Scala builds is just not feasible and I
don't think this will be too disruptive for users since it is already a
breaking change.
On Tue, Nov 20, 2018 at 7:05 AM Sean Owen wrote:
One more data point -- from looking at the SBT build yesterday, it
seems like most plugin updates require SBT 1.x, and both the plugins
and SBT 1.x seem to need Scala 2.12, as does the new zinc.
The current SBT, zinc, and plugins all appear to work OK with 2.12
now, but updating will pretty
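For concreteness, a minimal sketch of what carrying two Scala versions
looks like in an sbt 1.x build (sbt-flavored with illustrative version
numbers; Spark's actual build definition is Maven-based):

    // Minimal cross-build sketch (illustrative versions, not Spark's real build):
    scalaVersion       := "2.12.7"                  // default build
    crossScalaVersions := Seq("2.12.7", "2.11.12")  // `sbt +compile` builds both

    // Version-specific sources go in src/main/scala-2.11 and src/main/scala-2.12,
    // which sbt picks up automatically; dropping 2.11 support amounts to deleting
    // the 2.11 entry above and the scala-2.11 source tree.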
> Maintaining a separate PR builder for 2.11 isn't so bad
i actually beg to differ... it's more of a PITA than you might realize
managing more than one PRB (we have two already).
a much better solution would be for the test launching code either in the
PRB config, or scripts in the repo
I support dropping 2.11 support. My general logic is:
- 2.11 is EOL, and is all the more EOL in the middle of next year when
Spark 3 arrives
- I haven't heard of a critical dependency that has no 2.12 counterpart
- 2.11 users can stay on 2.4.x, which will be notionally supported
through, say, end
I’d add that if folks rely on Twitter libraries in their stack, they might
be stuck on older versions of those libs for a while, which might require
they stay on 2.11 for longer than they might otherwise like.
On Friday, November 16, 2018, Marcelo Vanzin wrote:
Now that the switch to 2.12 by default has been made, it might be good
to have a serious discussion about dropping 2.11 altogether. Many of
the main arguments have already been talked about. But I don't
remember anyone mentioning how easy it would be to break the 2.11
build now.
For example, the
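One concrete kind of breakage, as a sketch of my own rather than the
example this message went on to give: Scala 2.12 performs SAM conversion
for arbitrary Java functional interfaces, so code like the following
compiles under 2.12 but not under 2.11, and a 2.12-only PR builder would
let it through:

    // Compiles on Scala 2.12, which converts lambdas to any single-abstract-method
    // Java interface; fails on 2.11 ("type mismatch") without -Xexperimental.
    val task: Runnable = () => println("running")

    // Same story at Java interop boundaries using java.util.function types:
    val nonEmpty: java.util.function.Predicate[String] = s => !s.isEmpty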
This seems fine to me. At least we should be primarily testing against
2.12 now.
Shane will need to alter the current 2.12 master build to actually
test 2.11, but that should be a trivial change.
On Thu, Nov 8, 2018 at 12:11 AM DB Tsai wrote:
Based on the discussions, I created a PR that makes Scala 2.12 Spark's
default Scala version, with Scala 2.11 as the alternative. This implies
that Scala 2.12 will be used by our CI builds, including pull request
builds.
https://github.com/apache/spark/pull/22967
We can decide later
It's not making 2.12 the default that's the issue; it's not dropping
2.11. Supporting 2.13 could mean supporting 3 Scala versions at once,
which I claim is just too much. I think the options are likely:
- Support 2.11, 2.12 in Spark 3.0. Deprecate 2.11 and make 2.12 the
default. Add 2.13 support in 3.x and drop
Ok, got it -- it's really just an argument for not all of 2.11, 2.12 and
2.13 at the same time; always 2.12; now figure out when we stop 2.11
support and start 2.13 support.
On Wed, Nov 7, 2018 at 11:10 AM Sean Owen wrote:
> It's not making 2.12 the default that's the issue; it's not dropping 2.11.
I'm not following "exclude Scala 2.13". Is there something inherent in
making 2.12 the default Scala version in Spark 3.0 that would prevent us
from supporting the option of building with 2.13?
On Tue, Nov 6, 2018 at 5:48 PM Sean Owen wrote:
> That's possible here, sure. The issue is: would you
I spoke with the Scala team at Lightbend. They plan to do a 2.13-RC1
release in January and GA a few months later. Of course, nothing is ever
certain. What's the thinking for the Spark 3.0 timeline? If it's likely to
be late Q1 or in Q2, then it might make sense to add Scala 2.13 as an
alternative
That's possible here, sure. The issue is: would you exclude Scala 2.13
support in 3.0 for this, if it were otherwise ready to go?
I think it's not a hard rule that something has to be deprecated
previously to be removed in a major release. The notice is helpful,
sure, but there are lots of ways to
On Nov 6, 2018, at 2:55 PM, Felix Cheung wrote:
So to clarify, only Scala 2.12 is supported in Spark 3?
From: Ryan Blue
Sent: Tuesday, November 6, 2018 1:24 PM
To: d_t...@apple.com
Cc: Sean Owen; Spark Dev List; cdelg...@apple.com
Subject: Re: Make Scala 2.12 as default Scala version in Spark 3.0
+1 to Scala 2.12 as the default in Spark 3.0.
On Tue, Nov 6, 2018 at 11:50 AM DB Tsai wrote:
+1 on dropping Scala 2.11 in Spark 3.0 to simplify the build.
As Scala 2.11 will not support Java 11 unless we make a significant investment,
if we decide not to drop Scala 2.11 in Spark 3.0, what we can do is have only
the Scala 2.12 build support Java 11 while the Scala 2.11 build supports Java 8. But I
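Mechanically, that split could look something like the sketch below,
gating the javac target on the Scala binary version (an sbt-flavored
illustration of the idea, not Spark's actual build logic, which uses
Maven profiles):

    // Hedged sketch: choose the javac target per Scala version, so only the
    // 2.12 build targets Java 11 while the 2.11 build stays on Java 8.
    javacOptions ++= (CrossVersion.partialVersion(scalaVersion.value) match {
      case Some((2, 12)) => Seq("-source", "11", "-target", "11")
      case _             => Seq("-source", "1.8", "-target", "1.8")
    })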
I think we should make Scala 2.12 the default in Spark 3.0. I would
also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping
2.11 support means we'd support Scala 2.11 for years, the lifetime
of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or
3.2.0 release, kind
+1 for making Scala 2.12 the default for Spark 3.0.
Bests,
Dongjoon.
On Tue, Nov 6, 2018 at 11:13 AM DB Tsai wrote:
> We made Scala 2.11 the default Scala version in Spark 2.0. Now, the next
> Spark version will be 3.0, so it's a great time to discuss whether we
> should make Scala 2.12 the default