And?

Not only was the change documented, but there was more than one minor release 
with the deprecation in place before the removal of Java 7 and Scala 2.10 
support in the new major release. Java 7 and Scala 2.10 have never been 
anything but deprecated functionality in the Spark 2 API. It is just not the 
Spark PMC's fault if you chose not to follow that deprecation guidance.
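For concreteness, the deprecate-then-remove cycle the SemVer spec prescribes (quoted further down in this thread) looks like this in Java. This is a hypothetical sketch with invented names; it is not actual Spark or Kudu code:

```java
// Hypothetical illustration of the SemVer deprecation cycle under discussion.
// ClientApi, connect, and openSession are invented names, not real Spark or
// Kudu APIs.
public class ClientApi {

    /**
     * Old entry point. Marked deprecated in a minor release (say 1.5.0),
     * documented, and kept working through at least one minor release so
     * callers can migrate before it is removed in the next major (2.0.0).
     *
     * @deprecated use {@link #openSession(String)} instead
     */
    @Deprecated
    public static String connect(String master) {
        // Delegate to the replacement so behavior stays identical in the
        // interim releases.
        return openSession(master);
    }

    /** Replacement API, introduced alongside the deprecation. */
    public static String openSession(String master) {
        return "session:" + master;
    }
}
```

Removing `connect` before a major version bump, or without at least one intervening minor release carrying the `@Deprecated` marker, is what the spec forbids.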

> On Aug 15, 2017, at 8:27 PM, Dan Burkert <[email protected]> wrote:
> 
> Hi Mark,
> 
> On Tue, Aug 15, 2017 at 6:49 PM, Mark Hamstra <[email protected]>
> wrote:
> 
>> You are badly mistaken
> 
> 
> My interpretation of SemVer above is based on the definition at SemVer.org,
> which has this to say about when it's appropriate to remove deprecated
> functionality:
> 
>> When you deprecate part of your public API, you should do two things: (1)
>> update your documentation to let users know about the change, (2) issue a
>> new minor release with the deprecation in place. Before you completely
>> remove the functionality in a new major release there should be at least
>> one minor release that contains the deprecation so that users can smoothly
>> transition to the new API.
> 
> Also relevant, from the same source:
> 
>> Major version X (X.y.z | X > 0) MUST be incremented if any backwards
>> incompatible changes are introduced to the public API.
> 
> - Dan
> 
> 
>> 
>> On Tue, Aug 15, 2017 at 2:18 PM, Dan Burkert <[email protected]>
>> wrote:
>> 
>>> I'll preface my response by saying I don't think there are any hard and
>>> fast rules here, but I'd like us to try to continue following SemVer
>>> rules as much as possible.
>>> 
>>> On Tue, Aug 15, 2017 at 2:03 PM, Grant Henke <[email protected]>
>>> wrote:
>>>> 
>>>> 
>>>>   - Should/can we drop Spark 1 support in the next minor release?
>>>> 
>>> 
>>> My interpretation is that it's permissible to stop shipping releases of
>>> an artifact at any point (in this case kudu-spark1_2.10), so I'm all for
>>> dropping Spark 1 as soon as we feel the number of remaining users is
>>> sufficiently low.
>>> 
>>> 
>>>>   - Should/can we drop Java 7 support in the next minor release? Does it
>>>>   need to be a major release?
>>>> 
>>> 
>>> My interpretation of SemVer is that we can't drop JRE 7 support without
>>> a major version bump. That being said, I do think we're quickly
>>> approaching the point at which it would be appropriate to take this step.
>>> 
>>> 
>>>>   - How should we support Spark 2.2.0 if we don't drop Java 7? Should we
>>>>   only require Java 1.8 for the Spark 2 modules?
>>>> 
>>> 
>>> Spark has put us in a difficult position here: either kudu-spark2_2.11
>>> remains JRE 7 compatible and is capped at Spark 2.1, or we make an
>>> exception for kudu-spark2_2.11, drop JRE 7 compatibility, and continue
>>> floating the Spark version against the latest 2.x release. Given the
>>> velocity of the Spark project and the fact that Spark itself doesn't
>>> seem to have any qualms about breaking SemVer, I think we should do the
>>> latter.
>>> 
>>> 
>>>> --
>>>> Grant Henke
>>>> Software Engineer | Cloudera
>>>> [email protected] | twitter.com/gchenke | linkedin.com/in/granthenke
>>>> 
>>> 
>> 
