I spoke with the Scala team at Lightbend. They plan to do a 2.13-RC1 release in January and GA a few months later. Of course, nothing is ever certain. What's the thinking for the Spark 3.0 timeline? If it's likely to be late Q1 or in Q2, then it might make sense to add Scala 2.13 as an alternative Scala version.
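For context, supporting an alternative Scala version usually comes down to cross-building. A minimal sbt sketch of what that looks like in general — the module name and version numbers here are placeholders, not Spark's actual build (which is Maven-based):

```scala
// build.sbt — illustrative only, not Spark's real build definition.

name := "example-module"

// Default Scala version used when no cross version is selected.
scalaVersion := "2.12.8"

// Cross-build against 2.11, 2.12, and (eventually) 2.13.
// The 2.13 version string is a placeholder until GA.
crossScalaVersions := Seq("2.11.12", "2.12.8", "2.13.0")

// Version-specific sources can live in folders like src/main/scala-2.13
// for APIs that changed between versions (e.g. the 2.13 collections rework).
```

With this in place, `sbt +compile` builds against each listed Scala version in turn. The main cost is maintaining any version-specific source directories where APIs diverged.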
dean

*Dean Wampler, Ph.D.*
*VP, Fast Data Engineering at Lightbend*
Author: Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do>,
Fast Data Architectures for Streaming Applications
<http://www.oreilly.com/data/free/fast-data-architectures-for-streaming-applications.csp>,
and other content from O'Reilly
@deanwampler <http://twitter.com/deanwampler>
https://www.linkedin.com/in/deanwampler/
http://polyglotprogramming.com
https://github.com/deanwampler
https://www.flickr.com/photos/deanwampler/

On Tue, Nov 6, 2018 at 7:48 PM Sean Owen <sro...@gmail.com> wrote:

> That's possible here, sure. The issue is: would you exclude Scala 2.13
> support in 3.0 for this, if it were otherwise ready to go?
> I think it's not a hard rule that something has to be deprecated
> previously to be removed in a major release. The notice is helpful,
> sure, but there are lots of ways to provide that notice to end users.
> Lots of things are breaking changes in a major release. Or: deprecate
> in Spark 2.4.1, if desired?
>
> On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan <cloud0...@gmail.com> wrote:
> >
> > We made Scala 2.11 the default in Spark 2.0, then dropped Scala 2.10 in
> > Spark 2.3. Shall we follow that pattern and drop Scala 2.11 at some
> > point in Spark 3.x?
> >
> > On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin <r...@databricks.com> wrote:
> >>
> >> Have we deprecated Scala 2.11 already in an existing release?