I mean, from the perspective of someone developing Spark, it makes
things more complicated. That's just my point of view; people who
actually support Spark deployments may have a different opinion ;)

On Thu, Mar 24, 2016 at 2:41 PM, Jakob Odersky <ja...@odersky.com> wrote:
> You can, but since it's going to be a maintainability issue, I would
> argue it is in fact a problem.
>
> On Thu, Mar 24, 2016 at 2:34 PM, Marcelo Vanzin <van...@cloudera.com> wrote:
>> Hi Jakob,
>>
>> On Thu, Mar 24, 2016 at 2:29 PM, Jakob Odersky <ja...@odersky.com> wrote:
>>> Reynold's 3rd point is particularly strong in my opinion. Consider
>>> what would happen if Spark 2.0 doesn't require Java 8 and hence
>>> cannot support Scala 2.12. Would it be stuck on an older Scala
>>> version until 3.0 is out?
>>
>> That's a false choice. You can support Scala 2.10 (or 2.11) on
>> Java 7 and Scala 2.12 on Java 8.
>>
>> I'm not saying it's a great idea, just that what you're suggesting
>> isn't really a problem.
>>
>> --
>> Marcelo
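
For reference, a minimal sketch of what that split could look like in
an sbt build definition. The version numbers and settings below are
illustrative assumptions only; Spark's actual build (Maven-based, with
an sbt wrapper) is considerably more involved:

    // build.sbt -- illustrative sketch, not Spark's real build.
    // Cross-build against two Scala versions.
    crossScalaVersions := Seq("2.11.8", "2.12.0")

    // Emit Java 7 bytecode for Scala 2.11 and Java 8 bytecode for
    // Scala 2.12, since Scala 2.12 requires a Java 8 runtime.
    scalacOptions ++= {
      CrossVersion.partialVersion(scalaVersion.value) match {
        case Some((2, 12)) => Seq("-target:jvm-1.8")
        case _             => Seq("-target:jvm-1.7")
      }
    }

Each Scala version is then compiled and published as a separate
artifact, which is exactly the per-version maintenance burden being
discussed above.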

