On Thu, Mar 24, 2016 at 10:13 AM, Reynold Xin <r...@databricks.com> wrote:
> Yes

So is it safe to say the only hard requirement for Java 8 in your list is (4)?

(1) and (3) are infrastructure issues. Yes, it sucks to maintain more
testing infrastructure and potentially more complicated build scripts,
but does that really outweigh maintaining support for Java 7?

A cheap hack would also be to require jdk 1.8 for the build, but still
target java 7. You could then isolate java 8 tests in a separate
module that will get run in all builds because of that requirement.
There are downsides, of course: it's basically the same situation we
were in when we still supported Java 6 but were using jdk 1.7 to build
things. Setting the proper bootclasspath to use JDK 7's rt.jar during
compilation could solve a lot of those. (We already have both JDKs on
the Jenkins machines as far as I can tell.)
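
As an illustration, the "build on JDK 8, target Java 7" setup might
look like the following in the Maven build; the java7.home property is
hypothetical, not an existing Spark build setting, and would need to
point at a local JDK 7 install:

```xml
<!-- Sketch only: compile with JDK 8's javac but emit Java 7 bytecode,
     resolving core classes against JDK 7's rt.jar so that accidental
     use of Java 8-only APIs fails at compile time. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <source>1.7</source>
    <target>1.7</target>
    <compilerArguments>
      <!-- java7.home is illustrative; it must point at a JDK 7 install. -->
      <bootclasspath>${java7.home}/jre/lib/rt.jar</bootclasspath>
    </compilerArguments>
  </configuration>
</plugin>
```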

For Scala 2.12, an option might be dropping Java 7 when we decide to
add support for that (unless you're also suggesting Scala 2.12 as part
of 2.0?).

For (2), it seems the JVM used to compile things doesn't really make a
difference. It could be as simple as "we strongly recommend running
Spark 2.0 on Java 8".

Note I'm not for or against the change per se; I'd like to see more
data about what users are really running out there before making that
decision. But there was an explicit desire to maintain Java 7
compatibility when we talked about going for Spark 2.0. And with those
kinds of decisions there's always a cost, including spending more
resources on infrastructure and testing.

-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
