Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-21 Thread DB Tsai
+1 on removing Scala 2.11 support for 3.0, given that Scala 2.11 is already EOL.

On Tue, Nov 20, 2018 at 2:53 PM Sean Owen  wrote:

> PS: pull request at https://github.com/apache/spark/pull/23098
> Not going to merge it until there's clear agreement.
>
> On Tue, Nov 20, 2018 at 10:16 AM Ryan Blue  wrote:
> >
> > +1 to removing 2.11 support for 3.0 and a PR.
> >
> > It sounds like having multiple Scala builds is just not feasible and I
> don't think this will be too disruptive for users since it is already a
> breaking change.
> >
> > On Tue, Nov 20, 2018 at 7:05 AM Sean Owen  wrote:
> >>
> >> One more data point -- from looking at the SBT build yesterday, it
> >> seems like most plugin updates require SBT 1.x. And both they and SBT
> >> 1.x seem to need Scala 2.12. And the new zinc also does.
> >> Now, the current SBT and zinc and plugins all appear to work OK with
> >> 2.12 now, but updating will pretty much have to wait until 2.11
> >> support goes. (I don't think it's feasible to have two SBT builds.)
> >>
> >> I actually haven't heard an argument for keeping 2.11, compared to the
> >> overhead of maintaining it. Any substantive objections? Would it be
> >> too forward to put out a WIP PR that removes it?
> >>
> >> On Sat, Nov 17, 2018 at 7:28 PM Sean Owen  wrote:
> >> >
> >> > I support dropping 2.11 support. My general logic is:
> >> >
> >> > - 2.11 is EOL, and is all the more EOL in the middle of next year when
> >> > Spark 3 arrives
> >> > - I haven't heard of a critical dependency that has no 2.12
> counterpart
> >> > - 2.11 users can stay on 2.4.x, which will be notionally supported
> >> > through, say, end of 2019
> >> > - Maintaining 2.11 vs 2.12 support is modestly difficult, in my
> >> > experience resolving these differences across these two versions; it's
> >> > a hassle as you need two git clones with different scala versions in
> >> > the project tags
> >> > - The project is already short on resources to support things as it is
> >> > - Dropping things is generally necessary to add new things, to keep
> >> > complexity reasonable -- like Scala 2.13 support
> >> >
> >> > Maintaining a separate PR builder for 2.11 isn't so bad
> >> >
> >> > On Fri, Nov 16, 2018 at 4:09 PM Marcelo Vanzin
> >> >  wrote:
> >> > >
> >> > > Now that the switch to 2.12 by default has been made, it might be
> good
> >> > > to have a serious discussion about dropping 2.11 altogether. Many of
> >> > > the main arguments have already been talked about. But I don't
> >> > > remember anyone mentioning how easy it would be to break the 2.11
> >> > > build now.
> >> > >
> >> > > For example, the following works fine in 2.12 but breaks in 2.11:
> >> > >
> >> > > java.util.Arrays.asList("hi").stream().forEach(println)
> >> > >
> >> > > We had a similar issue when we supported java 1.6 but the builds
> were
> >> > > all on 1.7 by default. Every once in a while something would
> silently
> >> > > break, because PR builds only check the default. And the jenkins
> >> > > builds, which are less monitored, would stay broken for a while.
> >> > >
> >>
> >> -
> >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >>
> >
> >
> > --
> > Ryan Blue
> > Software Engineer
> > Netflix
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
> --
- DB
Sent from my iPhone


Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-20 Thread Sean Owen
PS: pull request at https://github.com/apache/spark/pull/23098
Not going to merge it until there's clear agreement.

On Tue, Nov 20, 2018 at 10:16 AM Ryan Blue  wrote:
>
> +1 to removing 2.11 support for 3.0 and a PR.
>
> It sounds like having multiple Scala builds is just not feasible and I don't 
> think this will be too disruptive for users since it is already a breaking 
> change.
>
> On Tue, Nov 20, 2018 at 7:05 AM Sean Owen  wrote:
>>
>> One more data point -- from looking at the SBT build yesterday, it
>> seems like most plugin updates require SBT 1.x. And both they and SBT
>> 1.x seem to need Scala 2.12. And the new zinc also does.
>> Now, the current SBT and zinc and plugins all appear to work OK with
>> 2.12 now, but updating will pretty much have to wait until 2.11
>> support goes. (I don't think it's feasible to have two SBT builds.)
>>
>> I actually haven't heard an argument for keeping 2.11, compared to the
>> overhead of maintaining it. Any substantive objections? Would it be
>> too forward to put out a WIP PR that removes it?
>>
>> On Sat, Nov 17, 2018 at 7:28 PM Sean Owen  wrote:
>> >
>> > I support dropping 2.11 support. My general logic is:
>> >
>> > - 2.11 is EOL, and is all the more EOL in the middle of next year when
>> > Spark 3 arrives
>> > - I haven't heard of a critical dependency that has no 2.12 counterpart
>> > - 2.11 users can stay on 2.4.x, which will be notionally supported
>> > through, say, end of 2019
>> > - Maintaining 2.11 vs 2.12 support is modestly difficult, in my
>> > experience resolving these differences across these two versions; it's
>> > a hassle as you need two git clones with different scala versions in
>> > the project tags
>> > - The project is already short on resources to support things as it is
>> > - Dropping things is generally necessary to add new things, to keep
>> > complexity reasonable -- like Scala 2.13 support
>> >
>> > Maintaining a separate PR builder for 2.11 isn't so bad
>> >
>> > On Fri, Nov 16, 2018 at 4:09 PM Marcelo Vanzin
>> >  wrote:
>> > >
>> > > Now that the switch to 2.12 by default has been made, it might be good
>> > > to have a serious discussion about dropping 2.11 altogether. Many of
>> > > the main arguments have already been talked about. But I don't
>> > > remember anyone mentioning how easy it would be to break the 2.11
>> > > build now.
>> > >
>> > > For example, the following works fine in 2.12 but breaks in 2.11:
>> > >
>> > > java.util.Arrays.asList("hi").stream().forEach(println)
>> > >
>> > > We had a similar issue when we supported java 1.6 but the builds were
>> > > all on 1.7 by default. Every once in a while something would silently
>> > > break, because PR builds only check the default. And the jenkins
>> > > builds, which are less monitored, would stay broken for a while.
>> > >
>>
>> -
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>
>
> --
> Ryan Blue
> Software Engineer
> Netflix

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-20 Thread Ryan Blue
+1 to removing 2.11 support for 3.0 and a PR.

It sounds like having multiple Scala builds is just not feasible and I
don't think this will be too disruptive for users since it is already a
breaking change.

On Tue, Nov 20, 2018 at 7:05 AM Sean Owen  wrote:

> One more data point -- from looking at the SBT build yesterday, it
> seems like most plugin updates require SBT 1.x. And both they and SBT
> 1.x seem to need Scala 2.12. And the new zinc also does.
> Now, the current SBT and zinc and plugins all appear to work OK with
> 2.12 now, but updating will pretty much have to wait until 2.11
> support goes. (I don't think it's feasible to have two SBT builds.)
>
> I actually haven't heard an argument for keeping 2.11, compared to the
> overhead of maintaining it. Any substantive objections? Would it be
> too forward to put out a WIP PR that removes it?
>
> On Sat, Nov 17, 2018 at 7:28 PM Sean Owen  wrote:
> >
> > I support dropping 2.11 support. My general logic is:
> >
> > - 2.11 is EOL, and is all the more EOL in the middle of next year when
> > Spark 3 arrives
> > - I haven't heard of a critical dependency that has no 2.12 counterpart
> > - 2.11 users can stay on 2.4.x, which will be notionally supported
> > through, say, end of 2019
> > - Maintaining 2.11 vs 2.12 support is modestly difficult, in my
> > experience resolving these differences across these two versions; it's
> > a hassle as you need two git clones with different scala versions in
> > the project tags
> > - The project is already short on resources to support things as it is
> > - Dropping things is generally necessary to add new things, to keep
> > complexity reasonable -- like Scala 2.13 support
> >
> > Maintaining a separate PR builder for 2.11 isn't so bad
> >
> > On Fri, Nov 16, 2018 at 4:09 PM Marcelo Vanzin
> >  wrote:
> > >
> > > Now that the switch to 2.12 by default has been made, it might be good
> > > to have a serious discussion about dropping 2.11 altogether. Many of
> > > the main arguments have already been talked about. But I don't
> > > remember anyone mentioning how easy it would be to break the 2.11
> > > build now.
> > >
> > > For example, the following works fine in 2.12 but breaks in 2.11:
> > >
> > > java.util.Arrays.asList("hi").stream().forEach(println)
> > >
> > > We had a similar issue when we supported java 1.6 but the builds were
> > > all on 1.7 by default. Every once in a while something would silently
> > > break, because PR builds only check the default. And the jenkins
> > > builds, which are less monitored, would stay broken for a while.
> > >
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>

-- 
Ryan Blue
Software Engineer
Netflix


Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-20 Thread Sean Owen
One more data point -- from looking at the SBT build yesterday, it
seems like most plugin updates require SBT 1.x. And both they and SBT
1.x seem to need Scala 2.12. And the new zinc also does.
Now, the current SBT and zinc and plugins all appear to work OK with
2.12 now, but updating will pretty much have to wait until 2.11
support goes. (I don't think it's feasible to have two SBT builds.)
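
For readers less familiar with the build mechanics, here is a minimal,
illustrative sbt sketch of what cross-building two Scala versions looks like.
It is not Spark's actual build definition (whose Scala version is primarily
driven by Maven), and the version numbers are simply the ones current at the
time of this thread; the point is that dropping 2.11 collapses the cross-build
to a single Scala version, which is what frees the build to move to SBT 1.x
and its plugins.

    // build.sbt -- illustrative sketch only, not Spark's real build definition
    // While 2.11 is still supported, modules cross-compile against both versions:
    crossScalaVersions := Seq("2.11.12", "2.12.7")
    // The default version used by a plain `sbt compile` (and by PR builds):
    scalaVersion := "2.12.7"
    // Once 2.11 support goes, this shrinks to Seq("2.12.7"), and plugins that
    // require SBT 1.x (itself built on Scala 2.12) can be adopted.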

I actually haven't heard an argument for keeping 2.11, compared to the
overhead of maintaining it. Any substantive objections? Would it be
too forward to put out a WIP PR that removes it?

On Sat, Nov 17, 2018 at 7:28 PM Sean Owen  wrote:
>
> I support dropping 2.11 support. My general logic is:
>
> - 2.11 is EOL, and is all the more EOL in the middle of next year when
> Spark 3 arrives
> - I haven't heard of a critical dependency that has no 2.12 counterpart
> - 2.11 users can stay on 2.4.x, which will be notionally supported
> through, say, end of 2019
> - Maintaining 2.11 vs 2.12 support is modestly difficult, in my
> experience resolving these differences across these two versions; it's
> a hassle as you need two git clones with different scala versions in
> the project tags
> - The project is already short on resources to support things as it is
> - Dropping things is generally necessary to add new things, to keep
> complexity reasonable -- like Scala 2.13 support
>
> Maintaining a separate PR builder for 2.11 isn't so bad
>
> On Fri, Nov 16, 2018 at 4:09 PM Marcelo Vanzin
>  wrote:
> >
> > Now that the switch to 2.12 by default has been made, it might be good
> > to have a serious discussion about dropping 2.11 altogether. Many of
> > the main arguments have already been talked about. But I don't
> > remember anyone mentioning how easy it would be to break the 2.11
> > build now.
> >
> > For example, the following works fine in 2.12 but breaks in 2.11:
> >
> > java.util.Arrays.asList("hi").stream().forEach(println)
> >
> > We had a similar issue when we supported java 1.6 but the builds were
> > all on 1.7 by default. Every once in a while something would silently
> > break, because PR builds only check the default. And the jenkins
> > builds, which are less monitored, would stay broken for a while.
> >

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-19 Thread shane knapp
>
>
> Maintaining a separate PR builder for 2.11 isn't so bad
>

i actually beg to differ... it's more of a PITA than you might realize to
manage more than one PRB (we have two already).

a much better solution would be for the test-launching code, either in the
PRB config or in scripts in the repo, to manage this.
-- 
Shane Knapp
UC Berkeley EECS Research / RISELab Staff Technical Lead
https://rise.cs.berkeley.edu


Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-17 Thread Sean Owen
I support dropping 2.11 support. My general logic is:

- 2.11 is EOL, and is all the more EOL in the middle of next year when
Spark 3 arrives
- I haven't heard of a critical dependency that has no 2.12 counterpart
- 2.11 users can stay on 2.4.x, which will be notionally supported
through, say, end of 2019
- Maintaining 2.11 vs 2.12 support is modestly difficult, in my
experience resolving these differences across these two versions; it's
a hassle as you need two git clones with different scala versions in
the project tags
- The project is already short on resources to support things as it is
- Dropping things is generally necessary to add new things, to keep
complexity reasonable -- like Scala 2.13 support

Maintaining a separate PR builder for 2.11 isn't so bad

On Fri, Nov 16, 2018 at 4:09 PM Marcelo Vanzin
 wrote:
>
> Now that the switch to 2.12 by default has been made, it might be good
> to have a serious discussion about dropping 2.11 altogether. Many of
> the main arguments have already been talked about. But I don't
> remember anyone mentioning how easy it would be to break the 2.11
> build now.
>
> For example, the following works fine in 2.12 but breaks in 2.11:
>
> java.util.Arrays.asList("hi").stream().forEach(println)
>
> We had a similar issue when we supported java 1.6 but the builds were
> all on 1.7 by default. Every once in a while something would silently
> break, because PR builds only check the default. And the jenkins
> builds, which are less monitored, would stay broken for a while.
>

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-16 Thread Justin Miller
I'd add that if folks rely on Twitter libraries in their stack, they might be
stuck on older versions of those libs for a while, which might require them to
stay on 2.11 for longer than they might otherwise like.

On Friday, November 16, 2018, Marcelo Vanzin 
wrote:

> Now that the switch to 2.12 by default has been made, it might be good
> to have a serious discussion about dropping 2.11 altogether. Many of
> the main arguments have already been talked about. But I don't
> remember anyone mentioning how easy it would be to break the 2.11
> build now.
>
> For example, the following works fine in 2.12 but breaks in 2.11:
>
> java.util.Arrays.asList("hi").stream().forEach(println)
>
> We had a similar issue when we supported java 1.6 but the builds were
> all on 1.7 by default. Every once in a while something would silently
> break, because PR builds only check the default. And the jenkins
> builds, which are less monitored, would stay broken for a while.
>
> On Tue, Nov 6, 2018 at 11:13 AM DB Tsai  wrote:
> >
> > We made Scala 2.11 as default Scala version in Spark 2.0. Now, the next
> Spark version will be 3.0, so it's a great time to discuss whether we should make
> Scala 2.12 as default Scala version in Spark 3.0.
> >
> > Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, it's unlikely
> to support JDK 11 in Scala 2.11 unless we're willing to sponsor the needed
> work per discussion in Scala community,
> https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
> >
> > We have initial support of Scala 2.12 in Spark 2.4. If we decide to make
> Scala 2.12 as default for Spark 3.0 now, we will have ample time to work on
> bugs and issues that we may run into.
> >
> > What do you think?
> >
> > Thanks,
> >
> > DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
> Apple, Inc
> >
> >
> > -
> > To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >
>
>
> --
> Marcelo
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>

-- 

Justin Miller
Senior Data Engineer
*GoSpotCheck*
Direct: 720-517-3979
Email: jus...@gospotcheck.com




Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-16 Thread Marcelo Vanzin
Now that the switch to 2.12 by default has been made, it might be good
to have a serious discussion about dropping 2.11 altogether. Many of
the main arguments have already been talked about. But I don't
remember anyone mentioning how easy it would be to break the 2.11
build now.

For example, the following works fine in 2.12 but breaks in 2.11:

java.util.Arrays.asList("hi").stream().forEach(println)

We had a similar issue when we supported java 1.6 but the builds were
all on 1.7 by default. Every once in a while something would silently
break, because PR builds only check the default. And the jenkins
builds, which are less monitored, would stay broken for a while.
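
For what it's worth, the 2.11 breakage above comes down to SAM conversion,
which arrived in Scala 2.12: a Scala function (or an eta-expanded method like
println) can be passed where a Java functional interface such as
java.util.function.Consumer is expected, while 2.11 cannot do that by default.
A minimal, illustrative sketch of a form that should compile on both versions,
for anyone who needs to keep the 2.11 build green in the meantime:

    import java.util.function.Consumer

    // Spell out the Consumer by hand instead of relying on 2.12's automatic
    // conversion of a Scala function into a Java lambda.
    val printer = new Consumer[String] {
      override def accept(s: String): Unit = println(s)
    }

    // Compiles on both Scala 2.11 and 2.12:
    java.util.Arrays.asList("hi").stream().forEach(printer)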

On Tue, Nov 6, 2018 at 11:13 AM DB Tsai  wrote:
>
> We made Scala 2.11 as default Scala version in Spark 2.0. Now, the next Spark 
> version will be 3.0, so it's a great time to discuss whether we should make Scala
> 2.12 as default Scala version in Spark 3.0.
>
> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, it's unlikely to
> support JDK 11 in Scala 2.11 unless we're willing to sponsor the needed work 
> per discussion in Scala community, 
> https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
>
> We have initial support of Scala 2.12 in Spark 2.4. If we decide to make 
> Scala 2.12 as default for Spark 3.0 now, we will have ample time to work on 
> bugs and issues that we may run into.
>
> What do you think?
>
> Thanks,
>
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, 
> Inc
>
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>


-- 
Marcelo

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-08 Thread Sean Owen
This seems fine to me. At least we should be primarily testing against
2.12 now.
Shane will need to alter the current 2.12 master build to actually
test 2.11, but that should be a trivial change.

On Thu, Nov 8, 2018 at 12:11 AM DB Tsai  wrote:
>
> Based on the discussions, I created a PR that makes Scala 2.12 Spark's
> default Scala version, with Scala 2.11 as the alternative version. This
> implies that Scala 2.12 will be used by our CI builds, including pull
> request builds.
>
> https://github.com/apache/spark/pull/22967
>
> We can decide later if we want to change the alternative Scala version
> to 2.13 and drop 2.11 if we just want to support two Scala versions at
> one time.
>
> Thanks.
>
> Sincerely,
>
> DB Tsai
> --
> Web: https://www.dbtsai.com
> PGP Key ID: 0x5CED8B896A6BDFA0
> On Wed, Nov 7, 2018 at 11:18 AM Sean Owen  wrote:
> >
> > It's not making 2.12 the default, but not dropping 2.11. Supporting
> > 2.13 could mean supporting 3 Scala versions at once, which I claim is
> > just too much. I think the options are likely:
> >
> > - Support 2.11, 2.12 in Spark 3.0. Deprecate 2.11 and make 2.12 the
> > default. Add 2.13 support in 3.x and drop 2.11 in the same release
> > - Deprecate 2.11 right now via announcement and/or Spark 2.4.1 soon.
> > Drop 2.11 support in Spark 3.0, and support only 2.12.
> > - (same as above, but add Scala 2.13 support if possible for Spark 3.0)
> >
> >
> > On Wed, Nov 7, 2018 at 12:32 PM Mark Hamstra  
> > wrote:
> > >
> > > I'm not following "exclude Scala 2.13". Is there something inherent in 
> > > making 2.12 the default Scala version in Spark 3.0 that would prevent us 
> > > from supporting the option of building with 2.13?
> > >
> > > On Tue, Nov 6, 2018 at 5:48 PM Sean Owen  wrote:
> > >>
> > >> That's possible here, sure. The issue is: would you exclude Scala 2.13
> > >> support in 3.0 for this, if it were otherwise ready to go?
> > >> I think it's not a hard rule that something has to be deprecated
> > >> previously to be removed in a major release. The notice is helpful,
> > >> sure, but there are lots of ways to provide that notice to end users.
> > >> Lots of things are breaking changes in a major release. Or: deprecate
> > >> in Spark 2.4.1, if desired?
> > >>
> > >> On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan  wrote:
> > >> >
> > >> > We make Scala 2.11 the default one in Spark 2.0, then drop Scala 2.10 
> > >> > in Spark 2.3. Shall we follow it and drop Scala 2.11 at some point of 
> > >> > Spark 3.x?
> > >> >
> > >> > On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin  wrote:
> > >> >>
> > >> >> Have we deprecated Scala 2.11 already in an existing release?
> > >>
> > >> -
> > >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> > >>
> >
> > -
> > To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-08 Thread DB Tsai
Based on the discussions, I created a PR that makes Scala 2.12 Spark's default
Scala version, with Scala 2.11 as the alternative version. This implies that
Scala 2.12 will be used by our CI builds, including pull request builds.

https://github.com/apache/spark/pull/22967

We can decide later if we want to change the alternative Scala version
to 2.13 and drop 2.11 if we just want to support two Scala versions at
one time.

Thanks.

Sincerely,

DB Tsai
--
Web: https://www.dbtsai.com
PGP Key ID: 0x5CED8B896A6BDFA0
On Wed, Nov 7, 2018 at 11:18 AM Sean Owen  wrote:
>
> It's not making 2.12 the default, but not dropping 2.11. Supporting
> 2.13 could mean supporting 3 Scala versions at once, which I claim is
> just too much. I think the options are likely:
>
> - Support 2.11, 2.12 in Spark 3.0. Deprecate 2.11 and make 2.12 the
> default. Add 2.13 support in 3.x and drop 2.11 in the same release
> - Deprecate 2.11 right now via announcement and/or Spark 2.4.1 soon.
> Drop 2.11 support in Spark 3.0, and support only 2.12.
> - (same as above, but add Scala 2.13 support if possible for Spark 3.0)
>
>
> On Wed, Nov 7, 2018 at 12:32 PM Mark Hamstra  wrote:
> >
> > I'm not following "exclude Scala 2.13". Is there something inherent in 
> > making 2.12 the default Scala version in Spark 3.0 that would prevent us 
> > from supporting the option of building with 2.13?
> >
> > On Tue, Nov 6, 2018 at 5:48 PM Sean Owen  wrote:
> >>
> >> That's possible here, sure. The issue is: would you exclude Scala 2.13
> >> support in 3.0 for this, if it were otherwise ready to go?
> >> I think it's not a hard rule that something has to be deprecated
> >> previously to be removed in a major release. The notice is helpful,
> >> sure, but there are lots of ways to provide that notice to end users.
> >> Lots of things are breaking changes in a major release. Or: deprecate
> >> in Spark 2.4.1, if desired?
> >>
> >> On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan  wrote:
> >> >
> >> > We make Scala 2.11 the default one in Spark 2.0, then drop Scala 2.10 in 
> >> > Spark 2.3. Shall we follow it and drop Scala 2.11 at some point of Spark 
> >> > 3.x?
> >> >
> >> > On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin  wrote:
> >> >>
> >> >> Have we deprecated Scala 2.11 already in an existing release?
> >>
> >> -
> >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >>
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-07 Thread Sean Owen
It's not making 2.12 the default, but not dropping 2.11. Supporting
2.13 could mean supporting 3 Scala versions at once, which I claim is
just too much. I think the options are likely:

- Support 2.11, 2.12 in Spark 3.0. Deprecate 2.11 and make 2.12 the
default. Add 2.13 support in 3.x and drop 2.11 in the same release
- Deprecate 2.11 right now via announcement and/or Spark 2.4.1 soon.
Drop 2.11 support in Spark 3.0, and support only 2.12.
- (same as above, but add Scala 2.13 support if possible for Spark 3.0)


On Wed, Nov 7, 2018 at 12:32 PM Mark Hamstra  wrote:
>
> I'm not following "exclude Scala 2.13". Is there something inherent in making 
> 2.12 the default Scala version in Spark 3.0 that would prevent us from 
> supporting the option of building with 2.13?
>
> On Tue, Nov 6, 2018 at 5:48 PM Sean Owen  wrote:
>>
>> That's possible here, sure. The issue is: would you exclude Scala 2.13
>> support in 3.0 for this, if it were otherwise ready to go?
>> I think it's not a hard rule that something has to be deprecated
>> previously to be removed in a major release. The notice is helpful,
>> sure, but there are lots of ways to provide that notice to end users.
>> Lots of things are breaking changes in a major release. Or: deprecate
>> in Spark 2.4.1, if desired?
>>
>> On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan  wrote:
>> >
>> > We make Scala 2.11 the default one in Spark 2.0, then drop Scala 2.10 in 
>> > Spark 2.3. Shall we follow it and drop Scala 2.11 at some point of Spark 
>> > 3.x?
>> >
>> > On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin  wrote:
>> >>
>> >> Have we deprecated Scala 2.11 already in an existing release?
>>
>> -
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-07 Thread Mark Hamstra
Ok, got it -- it's really just an argument for not all of 2.11, 2.12 and
2.13 at the same time; always 2.12; now figure out when we stop 2.11
support and start 2.13 support.

On Wed, Nov 7, 2018 at 11:10 AM Sean Owen  wrote:

> It's not making 2.12 the default, but not dropping 2.11. Supporting
> 2.13 could mean supporting 3 Scala versions at once, which I claim is
> just too much. I think the options are likely:
>
> - Support 2.11, 2.12 in Spark 3.0. Deprecate 2.11 and make 2.12 the
> default. Add 2.13 support in 3.x and drop 2.11 in the same release
> - Deprecate 2.11 right now via announcement and/or Spark 2.4.1 soon.
> Drop 2.11 support in Spark 3.0, and support only 2.12.
> - (same as above, but add Scala 2.13 support if possible for Spark 3.0)
>
>
> On Wed, Nov 7, 2018 at 12:32 PM Mark Hamstra 
> wrote:
> >
> > I'm not following "exclude Scala 2.13". Is there something inherent in
> making 2.12 the default Scala version in Spark 3.0 that would prevent us
> from supporting the option of building with 2.13?
> >
> > On Tue, Nov 6, 2018 at 5:48 PM Sean Owen  wrote:
> >>
> >> That's possible here, sure. The issue is: would you exclude Scala 2.13
> >> support in 3.0 for this, if it were otherwise ready to go?
> >> I think it's not a hard rule that something has to be deprecated
> >> previously to be removed in a major release. The notice is helpful,
> >> sure, but there are lots of ways to provide that notice to end users.
> >> Lots of things are breaking changes in a major release. Or: deprecate
> >> in Spark 2.4.1, if desired?
> >>
> >> On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan  wrote:
> >> >
> >> > We make Scala 2.11 the default one in Spark 2.0, then drop Scala 2.10
> in Spark 2.3. Shall we follow it and drop Scala 2.11 at some point of Spark
> 3.x?
> >> >
> >> > On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin 
> wrote:
> >> >>
> >> >> Have we deprecated Scala 2.11 already in an existing release?
> >>
> >> -
> >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >>
>


Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-07 Thread Mark Hamstra
I'm not following "exclude Scala 2.13". Is there something inherent in
making 2.12 the default Scala version in Spark 3.0 that would prevent us
from supporting the option of building with 2.13?

On Tue, Nov 6, 2018 at 5:48 PM Sean Owen  wrote:

> That's possible here, sure. The issue is: would you exclude Scala 2.13
> support in 3.0 for this, if it were otherwise ready to go?
> I think it's not a hard rule that something has to be deprecated
> previously to be removed in a major release. The notice is helpful,
> sure, but there are lots of ways to provide that notice to end users.
> Lots of things are breaking changes in a major release. Or: deprecate
> in Spark 2.4.1, if desired?
>
> On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan  wrote:
> >
> > We make Scala 2.11 the default one in Spark 2.0, then drop Scala 2.10 in
> Spark 2.3. Shall we follow it and drop Scala 2.11 at some point of Spark
> 3.x?
> >
> > On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin  wrote:
> >>
> >> Have we deprecated Scala 2.11 already in an existing release?
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>


Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-07 Thread Dean Wampler
I spoke with the Scala team at Lightbend. They plan to do a 2.13-RC1
release in January and GA a few months later. Of course, nothing is ever
certain. What's the thinking for the Spark 3.0 timeline? If it's likely to
be late Q1 or in Q2, then it might make sense to add Scala 2.13 as an
alternative Scala version.

dean


*Dean Wampler, Ph.D.*

*VP, Fast Data Engineering at Lightbend*
Author: Programming Scala, 2nd Edition; Fast Data Architectures for Streaming
Applications; and other content from O'Reilly
@deanwampler
https://www.linkedin.com/in/deanwampler/
http://polyglotprogramming.com
https://github.com/deanwampler
https://www.flickr.com/photos/deanwampler/


On Tue, Nov 6, 2018 at 7:48 PM Sean Owen  wrote:

> That's possible here, sure. The issue is: would you exclude Scala 2.13
> support in 3.0 for this, if it were otherwise ready to go?
> I think it's not a hard rule that something has to be deprecated
> previously to be removed in a major release. The notice is helpful,
> sure, but there are lots of ways to provide that notice to end users.
> Lots of things are breaking changes in a major release. Or: deprecate
> in Spark 2.4.1, if desired?
>
> On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan  wrote:
> >
> > We make Scala 2.11 the default one in Spark 2.0, then drop Scala 2.10 in
> Spark 2.3. Shall we follow it and drop Scala 2.11 at some point of Spark
> 3.x?
> >
> > On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin  wrote:
> >>
> >> Have we deprecated Scala 2.11 already in an existing release?
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>


Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread Sean Owen
That's possible here, sure. The issue is: would you exclude Scala 2.13
support in 3.0 for this, if it were otherwise ready to go?
I think it's not a hard rule that something has to be deprecated
previously to be removed in a major release. The notice is helpful,
sure, but there are lots of ways to provide that notice to end users.
Lots of things are breaking changes in a major release. Or: deprecate
in Spark 2.4.1, if desired?

On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan  wrote:
>
> We make Scala 2.11 the default one in Spark 2.0, then drop Scala 2.10 in 
> Spark 2.3. Shall we follow it and drop Scala 2.11 at some point of Spark 3.x?
>
> On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin  wrote:
>>
>> Have we deprecated Scala 2.11 already in an existing release?

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread Wenchen Fan
We make Scala 2.11 the default one in Spark 2.0, then drop Scala 2.10 in
Spark 2.3. Shall we follow it and drop Scala 2.11 at some point of Spark
3.x?

On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin  wrote:

> Have we deprecated Scala 2.11 already in an existing release?
>
> On Tue, Nov 6, 2018 at 4:43 PM DB Tsai  wrote:
>
>> Ideally, we would support only Scala 2.12 in Spark 3.
>>
>> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
>> Apple, Inc
>>
>> > On Nov 6, 2018, at 2:55 PM, Felix Cheung 
>> wrote:
>> >
>> > So to clarify, only scala 2.12 is supported in Spark 3?
>> >
>> >
>> > From: Ryan Blue 
>> > Sent: Tuesday, November 6, 2018 1:24 PM
>> > To: d_t...@apple.com
>> > Cc: Sean Owen; Spark Dev List; cdelg...@apple.com
>> > Subject: Re: Make Scala 2.12 as default Scala version in Spark 3.0
>> >
>> > +1 to Scala 2.12 as the default in Spark 3.0.
>> >
>> > On Tue, Nov 6, 2018 at 11:50 AM DB Tsai  wrote:
>> > +1 on dropping Scala 2.11 in Spark 3.0 to simplify the build.
>> >
>> > As Scala 2.11 will not support Java 11 unless we make a significant
>> investment, if we decide not to drop Scala 2.11 in Spark 3.0, what we can
>> do is have only the Scala 2.12 build support Java 11 while the Scala 2.11
>> build supports Java 8. But I agree with Sean that this can make the
>> dependencies really complicated; hence I support dropping Scala 2.11 in
>> Spark 3.0 directly.
>> >
>> > DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
>> Apple, Inc
>> >
>> >> On Nov 6, 2018, at 11:38 AM, Sean Owen  wrote:
>> >>
>> >> I think we should make Scala 2.12 the default in Spark 3.0. I would
>> >> also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping
>> >> 2.11 support means we'd support Scala 2.11 for years, the lifetime
>> >> of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or
>> >> 3.2.0 release, kind of like what happened with 2.10 in 2.x.
>> >>
>> >> Java (9-)11 support also complicates this. I think getting it to work
>> >> will need some significant dependency updates, and I worry not all
>> >> will be available for 2.11 or will present some knotty problems. We'll
>> >> find out soon if that forces the issue.
>> >>
>> >> Also note that Scala 2.13 is pretty close to release, and we'll want
>> >> to support it soon after release, perhaps sooner than the long delay
>> >> before 2.12 was supported (because it was hard!). It will probably be
>> >> out well before Spark 3.0. Cross-compiling for 3 Scala versions sounds
>> >> like too much. 3.0 could support 2.11 and 2.12, and 3.1 support 2.12
>> >> and 2.13, or something. But if 2.13 support is otherwise attainable at
>> >> the release of Spark 3.0, I wonder if that too argues for dropping
>> >> 2.11 support.
>> >>
>> >> Finally I'll say that Spark itself isn't dropping 2.11 support for a
>> >> while, no matter what; it still exists in the 2.4.x branch of course.
>> >> People who can't update off Scala 2.11 can stay on Spark 2.x, note.
>> >>
>> >> Sean
>> >>
>> >>
>> >> On Tue, Nov 6, 2018 at 1:13 PM DB Tsai  wrote:
>> >>>
>> >>> We made Scala 2.11 as default Scala version in Spark 2.0. Now, the
>> next Spark version will be 3.0, so it's a great time to discuss whether we should
>> make Scala 2.12 as default Scala version in Spark 3.0.
>> >>>
>> >>> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, it's
>> unlikely to support JDK 11 in Scala 2.11 unless we're willing to sponsor
>> the needed work per discussion in Scala community,
>> https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
>> >>>
>> >>> We have initial support of Scala 2.12 in Spark 2.4. If we decide to
>> make Scala 2.12 as default for Spark 3.0 now, we will have ample time to
>> work on bugs and issues that we may run into.
>> >>>
>> >>> What do you think?
>> >>>
>> >>> Thanks,
>> >>>
>> >>> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
>> Apple, Inc
>> >>>
>> >>>
>> >>> -
>> >>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>> >>>
>> >
>> >
>> >
>> > --
>> > Ryan Blue
>> > Software Engineer
>> > Netflix
>>
>>
>> -
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>>


Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread Reynold Xin
Have we deprecated Scala 2.11 already in an existing release?

On Tue, Nov 6, 2018 at 4:43 PM DB Tsai  wrote:

> Ideally, we would support only Scala 2.12 in Spark 3.
>
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
> Apple, Inc
>
> > On Nov 6, 2018, at 2:55 PM, Felix Cheung 
> wrote:
> >
> > So to clarify, only scala 2.12 is supported in Spark 3?
> >
> >
> > From: Ryan Blue 
> > Sent: Tuesday, November 6, 2018 1:24 PM
> > To: d_t...@apple.com
> > Cc: Sean Owen; Spark Dev List; cdelg...@apple.com
> > Subject: Re: Make Scala 2.12 as default Scala version in Spark 3.0
> >
> > +1 to Scala 2.12 as the default in Spark 3.0.
> >
> > On Tue, Nov 6, 2018 at 11:50 AM DB Tsai  wrote:
> > +1 on dropping Scala 2.11 in Spark 3.0 to simplify the build.
> >
> > As Scala 2.11 will not support Java 11 unless we make a significant
> investment, if we decide not to drop Scala 2.11 in Spark 3.0, what we can
> do is have only the Scala 2.12 build support Java 11 while the Scala 2.11
> build supports Java 8. But I agree with Sean that this can make the
> dependencies really complicated; hence I support dropping Scala 2.11 in
> Spark 3.0 directly.
> >
> > DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
> Apple, Inc
> >
> >> On Nov 6, 2018, at 11:38 AM, Sean Owen  wrote:
> >>
> >> I think we should make Scala 2.12 the default in Spark 3.0. I would
> >> also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping
> >> 2.11 support means we'd support Scala 2.11 for years, the lifetime
> >> of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or
> >> 3.2.0 release, kind of like what happened with 2.10 in 2.x.
> >>
> >> Java (9-)11 support also complicates this. I think getting it to work
> >> will need some significant dependency updates, and I worry not all
> >> will be available for 2.11 or will present some knotty problems. We'll
> >> find out soon if that forces the issue.
> >>
> >> Also note that Scala 2.13 is pretty close to release, and we'll want
> >> to support it soon after release, perhaps sooner than the long delay
> >> before 2.12 was supported (because it was hard!). It will probably be
> >> out well before Spark 3.0. Cross-compiling for 3 Scala versions sounds
> >> like too much. 3.0 could support 2.11 and 2.12, and 3.1 support 2.12
> >> and 2.13, or something. But if 2.13 support is otherwise attainable at
> >> the release of Spark 3.0, I wonder if that too argues for dropping
> >> 2.11 support.
> >>
> >> Finally I'll say that Spark itself isn't dropping 2.11 support for a
> >> while, no matter what; it still exists in the 2.4.x branch of course.
> >> People who can't update off Scala 2.11 can stay on Spark 2.x, note.
> >>
> >> Sean
> >>
> >>
> >> On Tue, Nov 6, 2018 at 1:13 PM DB Tsai  wrote:
> >>>
> >>> We made Scala 2.11 as default Scala version in Spark 2.0. Now, the
> next Spark version will be 3.0, so it's a great time to discuss whether we should
> make Scala 2.12 as default Scala version in Spark 3.0.
> >>>
> >>> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, it's unlikely
> to support JDK 11 in Scala 2.11 unless we're willing to sponsor the needed
> work per discussion in Scala community,
> https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
> >>>
> >>> We have initial support of Scala 2.12 in Spark 2.4. If we decide to
> make Scala 2.12 as default for Spark 3.0 now, we will have ample time to
> work on bugs and issues that we may run into.
> >>>
> >>> What do you think?
> >>>
> >>> Thanks,
> >>>
> >>> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
> Apple, Inc
> >>>
> >>>
> >>> -
> >>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >>>
> >
> >
> >
> > --
> > Ryan Blue
> > Software Engineer
> > Netflix
>
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>


Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread DB Tsai
Ideally, we would support only Scala 2.12 in Spark 3.

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc

> On Nov 6, 2018, at 2:55 PM, Felix Cheung  wrote:
> 
> So to clarify, only scala 2.12 is supported in Spark 3?
> 
>  
> From: Ryan Blue 
> Sent: Tuesday, November 6, 2018 1:24 PM
> To: d_t...@apple.com
> Cc: Sean Owen; Spark Dev List; cdelg...@apple.com
> Subject: Re: Make Scala 2.12 as default Scala version in Spark 3.0
>  
> +1 to Scala 2.12 as the default in Spark 3.0.
> 
> On Tue, Nov 6, 2018 at 11:50 AM DB Tsai  wrote:
> +1 on dropping Scala 2.11 in Spark 3.0 to simplify the build. 
> 
> As Scala 2.11 will not support Java 11 unless we make a significant 
> investment, if we decide not to drop Scala 2.11 in Spark 3.0, what we can do 
> is have only the Scala 2.12 build support Java 11 while the Scala 2.11 build
> supports Java 8. But I agree with Sean that this can make the dependencies
> really complicated; hence I support dropping Scala 2.11 in Spark 3.0 directly.
> 
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, 
> Inc
> 
>> On Nov 6, 2018, at 11:38 AM, Sean Owen  wrote:
>> 
>> I think we should make Scala 2.12 the default in Spark 3.0. I would
>> also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping
>> 2.11 support means we'd support Scala 2.11 for years, the lifetime
>> of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or
>> 3.2.0 release, kind of like what happened with 2.10 in 2.x.
>> 
>> Java (9-)11 support also complicates this. I think getting it to work
>> will need some significant dependency updates, and I worry not all
>> will be available for 2.11 or will present some knotty problems. We'll
>> find out soon if that forces the issue.
>> 
>> Also note that Scala 2.13 is pretty close to release, and we'll want
>> to support it soon after release, perhaps sooner than the long delay
>> before 2.12 was supported (because it was hard!). It will probably be
>> out well before Spark 3.0. Cross-compiling for 3 Scala versions sounds
>> like too much. 3.0 could support 2.11 and 2.12, and 3.1 support 2.12
>> and 2.13, or something. But if 2.13 support is otherwise attainable at
>> the release of Spark 3.0, I wonder if that too argues for dropping
>> 2.11 support.
>> 
>> Finally I'll say that Spark itself isn't dropping 2.11 support for a
>> while, no matter what; it still exists in the 2.4.x branch of course.
>> People who can't update off Scala 2.11 can stay on Spark 2.x, note.
>> 
>> Sean
>> 
>> 
>> On Tue, Nov 6, 2018 at 1:13 PM DB Tsai  wrote:
>>> 
>>> We made Scala 2.11 as default Scala version in Spark 2.0. Now, the next 
>>> Spark version will be 3.0, so it's a great time to discuss whether we should make 
>>> Scala 2.12 as default Scala version in Spark 3.0.
>>> 
>>> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, it's unlikely to 
>>> support JDK 11 in Scala 2.11 unless we're willing to sponsor the needed 
>>> work per discussion in Scala community, 
>>> https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
>>> 
>>> We have initial support of Scala 2.12 in Spark 2.4. If we decide to make 
>>> Scala 2.12 as default for Spark 3.0 now, we will have ample time to work on 
>>> bugs and issues that we may run into.
>>> 
>>> What do you think?
>>> 
>>> Thanks,
>>> 
>>> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, 
>>> Inc
>>> 
>>> 
>>> -
>>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>> 
> 
> 
> 
> -- 
> Ryan Blue
> Software Engineer
> Netflix


-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread Felix Cheung
So to clarify, only scala 2.12 is supported in Spark 3?



From: Ryan Blue 
Sent: Tuesday, November 6, 2018 1:24 PM
To: d_t...@apple.com
Cc: Sean Owen; Spark Dev List; cdelg...@apple.com
Subject: Re: Make Scala 2.12 as default Scala version in Spark 3.0

+1 to Scala 2.12 as the default in Spark 3.0.

On Tue, Nov 6, 2018 at 11:50 AM DB Tsai <d_t...@apple.com> wrote:
+1 on dropping Scala 2.11 in Spark 3.0 to simplify the build.

As Scala 2.11 will not support Java 11 unless we make a significant investment, 
if we decide not to drop Scala 2.11 in Spark 3.0, what we can do is have only 
the Scala 2.12 build support Java 11 while the Scala 2.11 build supports Java 8.
But I agree with Sean that this can make the dependencies really complicated;
hence I support dropping Scala 2.11 in Spark 3.0 directly.

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc

On Nov 6, 2018, at 11:38 AM, Sean Owen <sro...@gmail.com> wrote:

I think we should make Scala 2.12 the default in Spark 3.0. I would
also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping
2.11 support means we'd support Scala 2.11 for years, the lifetime
of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or
3.2.0 release, kind of like what happened with 2.10 in 2.x.

Java (9-)11 support also complicates this. I think getting it to work
will need some significant dependency updates, and I worry not all
will be available for 2.11 or will present some knotty problems. We'll
find out soon if that forces the issue.

Also note that Scala 2.13 is pretty close to release, and we'll want
to support it soon after release, perhaps sooner than the long delay
before 2.12 was supported (because it was hard!). It will probably be
out well before Spark 3.0. Cross-compiling for 3 Scala versions sounds
like too much. 3.0 could support 2.11 and 2.12, and 3.1 support 2.12
and 2.13, or something. But if 2.13 support is otherwise attainable at
the release of Spark 3.0, I wonder if that too argues for dropping
2.11 support.

Finally I'll say that Spark itself isn't dropping 2.11 support for a
while, no matter what; it still exists in the 2.4.x branch of course.
People who can't update off Scala 2.11 can stay on Spark 2.x, note.

Sean


On Tue, Nov 6, 2018 at 1:13 PM DB Tsai <d_t...@apple.com> wrote:

We made Scala 2.11 as default Scala version in Spark 2.0. Now, the next Spark 
version will be 3.0, so it's a great time to discuss whether we should make Scala 2.12 
as default Scala version in Spark 3.0.

Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, it's unlikely to 
support JDK 11 in Scala 2.11 unless we're willing to sponsor the needed work 
per discussion in Scala community, 
https://github.com/scala/scala-dev/issues/559#issuecomment-436160166

We have initial support of Scala 2.12 in Spark 2.4. If we decide to make Scala 
2.12 as default for Spark 3.0 now, we will have ample time to work on bugs and 
issues that we may run into.

What do you think?

Thanks,

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc


-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org




--
Ryan Blue
Software Engineer
Netflix


Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread Ryan Blue
+1 to Scala 2.12 as the default in Spark 3.0.

On Tue, Nov 6, 2018 at 11:50 AM DB Tsai  wrote:

> +1 on dropping Scala 2.11 in Spark 3.0 to simplify the build.
>
> As Scala 2.11 will not support Java 11 unless we make a significant
> investment, if we decide not to drop Scala 2.11 in Spark 3.0, what we can
> do is have only the Scala 2.12 build support Java 11 while the Scala 2.11
> build supports Java 8. But I agree with Sean that this can make the
> dependencies really complicated; hence I support dropping Scala 2.11 in
> Spark 3.0 directly.
>
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
> Apple, Inc
>
> On Nov 6, 2018, at 11:38 AM, Sean Owen  wrote:
>
> I think we should make Scala 2.12 the default in Spark 3.0. I would
> also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping
> 2.11 support means we'd support Scala 2.11 for years, the lifetime
> of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or
> 3.2.0 release, kind of like what happened with 2.10 in 2.x.
>
> Java (9-)11 support also complicates this. I think getting it to work
> will need some significant dependency updates, and I worry not all
> will be available for 2.11 or will present some knotty problems. We'll
> find out soon if that forces the issue.
>
> Also note that Scala 2.13 is pretty close to release, and we'll want
> to support it soon after release, perhaps sooner than the long delay
> before 2.12 was supported (because it was hard!). It will probably be
> out well before Spark 3.0. Cross-compiling for 3 Scala versions sounds
> like too much. 3.0 could support 2.11 and 2.12, and 3.1 support 2.12
> and 2.13, or something. But if 2.13 support is otherwise attainable at
> the release of Spark 3.0, I wonder if that too argues for dropping
> 2.11 support.
>
> Finally I'll say that Spark itself isn't dropping 2.11 support for a
> while, no matter what; it still exists in the 2.4.x branch of course.
> People who can't update off Scala 2.11 can stay on Spark 2.x, note.
>
> Sean
>
>
> On Tue, Nov 6, 2018 at 1:13 PM DB Tsai  wrote:
>
>
> We made Scala 2.11 as default Scala version in Spark 2.0. Now, the next
> Spark version will be 3.0, so it's a great time to discuss whether we should make
> Scala 2.12 as default Scala version in Spark 3.0.
>
> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, it's unlikely to
> support JDK 11 in Scala 2.11 unless we're willing to sponsor the needed
> work per discussion in Scala community,
> https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
>
> We have initial support of Scala 2.12 in Spark 2.4. If we decide to make
> Scala 2.12 as default for Spark 3.0 now, we will have ample time to work on
> bugs and issues that we may run into.
>
> What do you think?
>
> Thanks,
>
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
> Apple, Inc
>
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> 
>
>
>

-- 
Ryan Blue
Software Engineer
Netflix


Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread DB Tsai
+1 on dropping Scala 2.11 in Spark 3.0 to simplify the build. 

As Scala 2.11 will not support Java 11 unless we make a significant investment, 
if we decide not to drop Scala 2.11 in Spark 3.0, what we can do is have only 
the Scala 2.12 build support Java 11 while the Scala 2.11 build supports Java 8.
But I agree with Sean that this can make the dependencies really complicated;
hence I support dropping Scala 2.11 in Spark 3.0 directly.

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc

> On Nov 6, 2018, at 11:38 AM, Sean Owen  wrote:
> 
> I think we should make Scala 2.12 the default in Spark 3.0. I would
> also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping
> 2.11 support means we'd support Scala 2.11 for years, the lifetime
> of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or
> 3.2.0 release, kind of like what happened with 2.10 in 2.x.
> 
> Java (9-)11 support also complicates this. I think getting it to work
> will need some significant dependency updates, and I worry not all
> will be available for 2.11 or will present some knotty problems. We'll
> find out soon if that forces the issue.
> 
> Also note that Scala 2.13 is pretty close to release, and we'll want
> to support it soon after release, perhaps sooner than the long delay
> before 2.12 was supported (because it was hard!). It will probably be
> out well before Spark 3.0. Cross-compiling for 3 Scala versions sounds
> like too much. 3.0 could support 2.11 and 2.12, and 3.1 support 2.12
> and 2.13, or something. But if 2.13 support is otherwise attainable at
> the release of Spark 3.0, I wonder if that too argues for dropping
> 2.11 support.
> 
> Finally I'll say that Spark itself isn't dropping 2.11 support for a
> while, no matter what; it still exists in the 2.4.x branch of course.
> People who can't update off Scala 2.11 can stay on Spark 2.x, note.
> 
> Sean
> 
> 
> On Tue, Nov 6, 2018 at 1:13 PM DB Tsai  wrote:
>> 
>> We made Scala 2.11 as default Scala version in Spark 2.0. Now, the next 
>> Spark version will be 3.0, so it's a great time to discuss whether we should make 
>> Scala 2.12 as default Scala version in Spark 3.0.
>> 
>> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, it's unlikely to 
>> support JDK 11 in Scala 2.11 unless we're willing to sponsor the needed work 
>> per discussion in Scala community, 
>> https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
>> 
>> We have initial support of Scala 2.12 in Spark 2.4. If we decide to make 
>> Scala 2.12 as default for Spark 3.0 now, we will have ample time to work on 
>> bugs and issues that we may run into.
>> 
>> What do you think?
>> 
>> Thanks,
>> 
>> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, 
>> Inc
>> 
>> 
>> -
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>> 



Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread Sean Owen
I think we should make Scala 2.12 the default in Spark 3.0. I would
also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping
2.11 support means we'd support Scala 2.11 for years, the lifetime
of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or
3.2.0 release, kind of like what happened with 2.10 in 2.x.

Java (9-)11 support also complicates this. I think getting it to work
will need some significant dependency updates, and I worry not all
will be available for 2.11 or will present some knotty problems. We'll
find out soon if that forces the issue.

Also note that Scala 2.13 is pretty close to release, and we'll want
to support it soon after release, perhaps sooner than the long delay
before 2.12 was supported (because it was hard!). It will probably be
out well before Spark 3.0. Cross-compiling for 3 Scala versions sounds
like too much. 3.0 could support 2.11 and 2.12, and 3.1 support 2.12
and 2.13, or something. But if 2.13 support is otherwise attainable at
the release of Spark 3.0, I wonder if that too argues for dropping
2.11 support.

Finally I'll say that Spark itself isn't dropping 2.11 support for a
while, no matter what; it still exists in the 2.4.x branch of course.
People who can't update off Scala 2.11 can stay on Spark 2.x, note.

Sean


On Tue, Nov 6, 2018 at 1:13 PM DB Tsai  wrote:
>
> We made Scala 2.11 as default Scala version in Spark 2.0. Now, the next Spark 
> version will be 3.0, so it's a great time to discuss whether we should make Scala 
> 2.12 as default Scala version in Spark 3.0.
>
> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, it's unlikely to 
> support JDK 11 in Scala 2.11 unless we're willing to sponsor the needed work 
> per discussion in Scala community, 
> https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
>
> We have initial support of Scala 2.12 in Spark 2.4. If we decide to make 
> Scala 2.12 as default for Spark 3.0 now, we will have ample time to work on 
> bugs and issues that we may run into.
>
> What do you think?
>
> Thanks,
>
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, 
> Inc
>
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: Make Scala 2.12 as default Scala version in Spark 3.0

2018-11-06 Thread Dongjoon Hyun
+1 for making Scala 2.12 as default for Spark 3.0.

Bests,
Dongjoon.


On Tue, Nov 6, 2018 at 11:13 AM DB Tsai  wrote:

> We made Scala 2.11 as default Scala version in Spark 2.0. Now, the next
> Spark version will be 3.0, so it's a great time to discuss whether we should make
> Scala 2.12 as default Scala version in Spark 3.0.
>
> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, it's unlikely to
> support JDK 11 in Scala 2.11 unless we're willing to sponsor the needed
> work per discussion in Scala community,
> https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
>
> We have initial support of Scala 2.12 in Spark 2.4. If we decide to make
> Scala 2.12 as default for Spark 3.0 now, we will have ample time to work on
> bugs and issues that we may run into.
>
> What do you think?
>
> Thanks,
>
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
> Apple, Inc
>
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>