+1 for making Scala 2.12 the default for Spark 3.0.

Bests,
Dongjoon.


On Tue, Nov 6, 2018 at 11:13 AM DB Tsai <d_t...@apple.com> wrote:

> We made Scala 2.11 the default Scala version in Spark 2.0. Now that the
> next Spark version will be 3.0, it's a great time to discuss whether we
> should make Scala 2.12 the default Scala version in Spark 3.0.
>
> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, JDK 11 is
> unlikely to be supported on Scala 2.11 unless we're willing to sponsor the
> needed work, per the discussion in the Scala community:
> https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
>
> We have initial support for Scala 2.12 in Spark 2.4. If we decide to make
> Scala 2.12 the default for Spark 3.0 now, we will have ample time to work
> on the bugs and issues that we may run into.
>
> What do you think?
>
> Thanks,
>
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
> Apple, Inc
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>
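For anyone trying out the existing 2.12 support, one quick sanity check is
to confirm which Scala version a given build actually runs on. A minimal
sketch using only the Scala standard library (the object name here is just
for illustration, not anything in Spark):

    // Print the Scala version the current JVM process is running on,
    // e.g. to verify a Spark build compiled against 2.12.
    object ScalaVersionCheck {
      def main(args: Array[String]): Unit = {
        // versionNumberString yields e.g. "2.12.7"
        println(s"Scala version: ${scala.util.Properties.versionNumberString}")
      }
    }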
