I'll preface my response by saying I don't think there are any hard and
fast rules here, but I'd like us to try to continue following SemVer as
much as possible.

On Tue, Aug 15, 2017 at 2:03 PM, Grant Henke <[email protected]> wrote:
>
>
>    - Should/can we drop Spark 1 support in the next minor release?
>

My interpretation is that it's permissible to stop shipping releases of an
artifact at any point (in this case kudu-spark1_2.10), so I'm all for
dropping Spark 1 as soon as we feel the number of remaining users is
sufficiently low.


>    - Should/can we drop Java 7 support in the next minor release? Does it
>    need to be a major release?
>

My interpretation of SemVer is that we can't drop JRE 7 support without a
major version bump. That being said, I do think we're quickly approaching
the point at which it would be appropriate to take that step.


>    - How should we support Spark 2.2.0 if we don't drop Java 7? Should we
>    only require Java 1.8 for the Spark 2 modules?
>

Spark has put us in a difficult position here: either kudu-spark2_2.11
remains JRE 7 compatible and is capped at Spark 2.1, or we make an
exception for kudu-spark2_2.11, drop JRE 7 compatibility, and continue
floating the Spark version against the latest 2.x release. Given the
velocity of the Spark project, and the fact that Spark itself doesn't seem
to have any qualms about breaking SemVer, I think we should do the latter.
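
For what it's worth, if we did go the "Java 8 only for the Spark 2 modules"
route, I'd imagine something along these lines in the Spark 2 module's POM.
This is just a rough sketch assuming the Java build is still Maven-based;
the module layout and the parent's compiler defaults are illustrative, not
what's actually in the tree today:

    <!-- Hypothetical kudu-spark2 pom.xml fragment: this module alone
         requires Java 8, while the rest of the build keeps its Java 7
         defaults inherited from the parent. -->
    <properties>
      <maven.compiler.source>1.8</maven.compiler.source>
      <maven.compiler.target>1.8</maven.compiler.target>
    </properties>

    <build>
      <plugins>
        <!-- Fail fast if someone builds this module with a pre-8 JDK. -->
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-enforcer-plugin</artifactId>
          <executions>
            <execution>
              <id>enforce-java-8</id>
              <goals>
                <goal>enforce</goal>
              </goals>
              <configuration>
                <rules>
                  <requireJavaVersion>
                    <version>[1.8,)</version>
                  </requireJavaVersion>
                </rules>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>

That keeps the JRE 8 requirement scoped to kudu-spark2_2.11 and makes the
requirement explicit at build time rather than surfacing as a runtime error.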


> --
> Grant Henke
> Software Engineer | Cloudera
> [email protected] | twitter.com/gchenke | linkedin.com/in/granthenke
>
