Following https://github.com/apache/spark/pull/12165#issuecomment-205791222
I'd like to make a point about process and then answer points below.

We have this funny system where anyone can propose a change, and any
of a few people can veto a change unilaterally. The latter rarely
comes up. Nine changes out of ten nobody disagrees on; sometimes a
committer will say 'no' to a change and nobody else with that bit
disagrees.

Sometimes it matters and here I see, what, 4 out of 5 people including
committers supporting a particular change. A veto to oppose that is
pretty drastic. It's not something to use because you or customers
prefer a certain outcome. This reads like you're informing people
you've changed your mind and that's the decision, when it can't work
that way. I saw this happen to a lesser extent in the thread about
Scala 2.10.

It doesn't mean majority rules here either, but can I suggest you
instead counter-propose an outcome that the people here voting in
favor of what you're vetoing would probably also buy into? I bet
everyone's willing to give wide accommodation to your concerns. It's
probably not hard, like: let's plan to not support Java 7 in Spark
2.1.0. (Then we can debate the logic of that.)

On Mon, Apr 4, 2016 at 6:28 AM, Reynold Xin <r...@databricks.com> wrote:
> some technology companies, are still using Java 7. One thing is that up
> until this date, users still can't install openjdk 8 on Ubuntu by default. I
> see that as an indication that it is too early to drop Java 7.

I have Java 8 on my Ubuntu instance, and installed it directly via apt-get.
http://openjdk.java.net/install/
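
For what it's worth, the install is a one-liner (assuming a recent
Ubuntu release where openjdk-8-jdk is in the default repositories;
older releases may need a backport PPA, which I haven't verified):

```shell
# Refresh the package index, then install the OpenJDK 8 JDK.
sudo apt-get update
sudo apt-get install -y openjdk-8-jdk

# Confirm which Java is active.
java -version
```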


> Looking at the timeline, JDK release a major new version roughly every 3
> years. We dropped Java 6 support one year ago, so from a timeline point of
> view we would be very aggressive here if we were to drop Java 7 support in
> Spark 2.0.

The metric is really (IMHO) when the JDK goes EOL. Java 6 was EOL in
Feb 2013, so supporting it into Spark 1.x was probably too long. Java
7 was EOL in April 2015. The relevant cadence isn't really "every ~3
years".


> Note that not dropping Java 7 support now doesn't mean we have to support
> Java 7 throughout Spark 2.x. We dropped Java 6 support in Spark 1.5, even
> though Spark 1.0 started with Java 6.

Whatever arguments one has about preventing people from updating to
the latest and greatest then apply to a *minor* release, which is
worse. Java 6 support was probably overdue for removal at 1.0;
dropping it in 1.5 was better late than never, not evidence that a
minor release is the right time to do it.


> In terms of testing, Josh has actually improved our test infra so now we
> would run the Java 8 tests: https://github.com/apache/spark/pull/12073

Excellent, but, orthogonal.

Even if I personally don't see the merit in these arguments compared
to the counter-arguments, retaining Java 7 support now wouldn't be a
terrible outcome. I'd like to see better process and a more reasonable
compromise result though.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
