A few other reasons to drop 2.10 support sooner rather than later:

   - We at Lightbend are evaluating some fundamental changes to the REPL to
   make it work better for large heaps, especially for Spark. There are other
   recent and planned enhancements. This work will benefit notebook users,
   too. However, we won't backport these improvements to 2.10.
   - Scala 2.12 is coming out midyear. It will require Java 8, which means
   it will produce dramatically smaller code (by compiling function literals
   to Java 8 lambdas instead of generating a custom class for each one; see
   the sketch after this list) and it will offer some performance
   improvements. Hopefully Spark will support it as an optional Scala
   version relatively quickly after it becomes available, which means it
   would be nice to avoid supporting three versions of Scala.
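
To make the code-size point concrete, here is a minimal sketch (my reading
of the Scala 2.12 plans, not a definitive account of the compiler
internals):

    val double: Int => Int = _ * 2
    // Scala 2.11 compiles this function literal to a dedicated synthetic
    // class (roughly `class anonfun$1 extends AbstractFunction1[Int, Int]`),
    // producing one extra class file per lambda.
    // Scala 2.12 instead emits an invokedynamic instruction that defers to
    // Java 8's LambdaMetafactory at runtime, so no per-lambda class file
    // is generated.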

Using Scala 2.10 at this point is like using Java 1.6: seriously out of
date. If you're using libraries that still require 2.10, are you sure those
libraries are being properly maintained? Or are they legacy dependencies
that should be eliminated before they become a liability? Even if you can't
upgrade Scala versions in the next few months, you can certainly continue
using Spark 1.x until you're ready to upgrade.

So, I recommend that Spark 2.0 drop Scala 2.10 support from the beginning.

dean

Dean Wampler, Ph.D.
Author: Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
Lightbend <http://lightbend.com>
@deanwampler <http://twitter.com/deanwampler>
http://polyglotprogramming.com

On Tue, Apr 5, 2016 at 8:54 PM, Kostas Sakellis <kos...@cloudera.com> wrote:

> From both this and the JDK thread, I've noticed that people (including
> myself) have different notions of the compatibility guarantees between
> major and minor versions.
> A simple question I have is: what compatibility can we break in a minor
> vs. a major release?
>
> It might be worth getting on the same page wrt compatibility guarantees.
>
> Just a thought,
> Kostas
>
> On Tue, Apr 5, 2016 at 4:39 PM, Holden Karau <hol...@pigscanfly.ca> wrote:
>
>> One minor downside to having both 2.10 and 2.11 (and eventually 2.12) is
>> deprecation warnings in our builds that we can't fix without introducing a
>> wrapper or Scala-version-specific code. This isn't a big deal, and if we
>> drop 2.10 in the 3-6 month time frame discussed, we can clean up those
>> warnings once we get there.
>>
>> On Fri, Apr 1, 2016 at 10:00 PM, Raymond Honderdors <
>> raymond.honderd...@sizmek.com> wrote:
>>
>>> What about a separate branch for Scala 2.10?
>>>
>>>
>>> -------- Original message --------
>>> From: Koert Kuipers <ko...@tresata.com>
>>> Date: 4/2/2016 02:10 (GMT+02:00)
>>> To: Michael Armbrust <mich...@databricks.com>
>>> Cc: Matei Zaharia <matei.zaha...@gmail.com>, Mark Hamstra <
>>> m...@clearstorydata.com>, Cody Koeninger <c...@koeninger.org>, Sean
>>> Owen <so...@cloudera.com>, dev@spark.apache.org
>>> Subject: Re: Discuss: commit to Scala 2.10 support for Spark 2.x
>>> lifecycle
>>>
>>> As long as we don't lock ourselves into supporting Scala 2.10 for the
>>> entire Spark 2 lifespan, it sounds reasonable to me.
>>>
>>> On Wed, Mar 30, 2016 at 3:25 PM, Michael Armbrust <
>>> mich...@databricks.com> wrote:
>>>
>>>> +1 to Matei's reasoning.
>>>>
>>>> On Wed, Mar 30, 2016 at 9:21 AM, Matei Zaharia <matei.zaha...@gmail.com
>>>> > wrote:
>>>>
>>>>> I agree that putting it in 2.0 doesn't mean keeping Scala 2.10 for the
>>>>> entire 2.x line. My vote is to keep Scala 2.10 in Spark 2.0, because it's
>>>>> the default version we built with in 1.x. We want to make the transition
>>>>> from 1.x to 2.0 as easy as possible. In 2.0, we'll have the default
>>>>> downloads be for Scala 2.11, so people will more easily move, but we
>>>>> shouldn't create obstacles that lead to fragmenting the community and
>>>>> slowing down Spark 2.0's adoption. I've seen companies that stayed on an
>>>>> old Scala version for multiple years because switching it, or mixing
>>>>> versions, would affect the company's entire codebase.
>>>>>
>>>>> Matei
>>>>>
>>>>> On Mar 30, 2016, at 12:08 PM, Koert Kuipers <ko...@tresata.com> wrote:
>>>>>
>>>>> Oh wow, I had no idea it got ripped out.
>>>>>
>>>>> On Wed, Mar 30, 2016 at 11:50 AM, Mark Hamstra <
>>>>> m...@clearstorydata.com> wrote:
>>>>>
>>>>>> No, with 2.0 Spark really doesn't use Akka:
>>>>>> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SparkConf.scala#L744
>>>>>>
>>>>>> On Wed, Mar 30, 2016 at 9:10 AM, Koert Kuipers <ko...@tresata.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Spark still runs on Akka. So if you want the benefits of the latest
>>>>>>> Akka (not saying we do; it was just an example), then you need to drop
>>>>>>> Scala 2.10.
>>>>>>> On Mar 30, 2016 10:44 AM, "Cody Koeninger" <c...@koeninger.org>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> I agree with Mark in that I don't see how supporting Scala 2.10 for
>>>>>>>> Spark 2.0 implies supporting it for all of Spark 2.x.
>>>>>>>>
>>>>>>>> Regarding Koert's comment on Akka, I thought all Akka dependencies
>>>>>>>> had been removed from Spark after SPARK-7997 and the recent removal
>>>>>>>> of external/akka.
>>>>>>>>
>>>>>>>> On Wed, Mar 30, 2016 at 9:36 AM, Mark Hamstra <
>>>>>>>> m...@clearstorydata.com> wrote:
>>>>>>>> > Dropping Scala 2.10 support has to happen at some point, so I'm
>>>>>>>> not
>>>>>>>> > fundamentally opposed to the idea; but I've got questions about
>>>>>>>> how we go
>>>>>>>> > about making the change and what degree of negative consequences
>>>>>>>> we are
>>>>>>>> > willing to accept.  Until now, we have been saying that 2.10
>>>>>>>> support will be
>>>>>>>> > continued in Spark 2.0.0.  Switching to 2.11 will be non-trivial
>>>>>>>> for some
>>>>>>>> > Spark users, so abruptly dropping 2.10 support is very likely to
>>>>>>>> delay
>>>>>>>> > migration to Spark 2.0 for those users.
>>>>>>>> >
>>>>>>>> > What about continuing 2.10 support in 2.0.x, but repeatedly
>>>>>>>> making an
>>>>>>>> > obvious announcement in multiple places that such support is
>>>>>>>> deprecated,
>>>>>>>> > that we are not committed to maintaining it throughout 2.x, and
>>>>>>>> that it is,
>>>>>>>> > in fact, scheduled to be removed in 2.1.0?
>>>>>>>> >
>>>>>>>> > On Wed, Mar 30, 2016 at 7:45 AM, Sean Owen <so...@cloudera.com>
>>>>>>>> wrote:
>>>>>>>> >>
>>>>>>>> >> (This should fork as its own thread, though it began during
>>>>>>>> discussion
>>>>>>>> >> of whether to continue Java 7 support in Spark 2.x.)
>>>>>>>> >>
>>>>>>>> >> Simply: I would like to more clearly take the temperature of all
>>>>>>>> >> interested parties about whether to support Scala 2.10 in the Spark
>>>>>>>> >> 2.x lifecycle. Some of the arguments appear to be:
>>>>>>>> >>
>>>>>>>> >> Pro (reasons to keep 2.10):
>>>>>>>> >> - Some third-party dependencies do not support Scala 2.11+ yet and
>>>>>>>> >> so would not be usable in a Spark app
>>>>>>>> >>
>>>>>>>> >> Con (reasons to drop it):
>>>>>>>> >> - Lower maintenance overhead -- no separate 2.10 build,
>>>>>>>> >> cross-building, or tests to check, especially considering that
>>>>>>>> >> support for 2.12 will be needed
>>>>>>>> >> - Can use 2.11+ features freely
>>>>>>>> >> - 2.10 was EOL'd in late 2014, and the Spark 2.x lifecycle is years
>>>>>>>> >> to come
>>>>>>>> >>
>>>>>>>> >> I would prefer not to support 2.10 for Spark 2.x, myself.
>>>>>>>> >>
>>>>>>>> >>
>>>>>>>> >>
>>>>>>>> >
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>>
>> --
>> Cell : 425-233-8271
>> Twitter: https://twitter.com/holdenkarau
>>
>
>
