On 04/03/2017 07:25 PM, Dave Thaler via iotivity-dev wrote:
> One issue is if someone changes a public non-experimental API and changes all 
> the callers in the iotivity project (e.g., all samples) then the break goes 
> undetected.
> Another issue is if there is a public non-experimental API without any unit 
> tests or sample, then the break again goes undetected.
> 
> Another issue is that there's no convention in use to tag experimental (aka 
> preview) APIs.
> We have @deprecated but nothing like @experimental or @preview.   We should 
> probably define some actual compile-time annotation (not just doxygen) to 
> indicate
> "preview" (experimental) APIs that may change in the next release.   That's 
> what we do with Microsoft APIs and it works well.
> Anything without such an annotation could be relied on to not change other 
> than being @deprecated.
> 
> From: C.J. Collier [mailto:cjcollier at linuxfoundation.org]
> Sent: Monday, April 3, 2017 6:19 PM
> To: Dave Thaler <dthaler at microsoft.com>
> Cc: iotivity-dev at lists.iotivity.org
> Subject: Re: [dev] Don't make breaking changes
> 
> Yes.  Let's codify this with JJB definitions, compliance verification tests 
> that feed back to gerrit.  Do not allow code into the repository which 
> breaks the build.  Run the build successfully before completing a merge.

There are several things...

We do have a system which requires a patch to build (and pass the unit
tests) when applied in order to get a +1 from jenkins. I'm not
completely sure whether you have to ask for that (by adding jenkins as a
reviewer); it has seemed to me you don't.  That's fine - there have been
a few "build is broken now" episodes, which indicates an occasional lack
of discipline about when changes are actually pushed, but those have
been essentially minor hiccups, soon fixed.

Dave was talking about detecting changes to the API, which is not the
same thing as a build failure.  You could conceive of having
API-validation tests be part of the automated acceptance checks as well,
but I don't think we have that now. I've only looked at a small number
of the unit tests; those are useful, but they don't seem to be
constructed as a complete validation of the API details.  "Compliance
tests" (if we mean running the CTT?) would help here, but I don't think
anyone believes those are thorough enough yet at the IoTivity API level.
As an outsider, I haven't heard much about the progress of the
validation development contract, or whether it could be applied at such
a fine-grained level (a gerrit hook that could detect an API-breaking
change).

And then I'm not convinced we have a precise enough definition of what
"the API" (one or several) actually is. I think this falls partly under
Dave's observation that we don't have an API Maintainer, but perhaps
it's even more than that - it's hard to put all of that on one person;
you want to have things completely clear in markup and tooling. Again
I'm pretty much just repeating Dave here... so I'll stop restating and
leave it for others to comment.
