Sonar already tracks unit test coverage. It can also track integration test coverage, although that may be harder to set up for CloudStack, since not all of the hardware/software requirements are available in the Jenkins environment. That limitation, however, would apply in any environment.
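One conceivable wiring for this (a sketch only, not how the CloudStack build is actually configured) is to have JaCoCo collect separate execution files for unit and integration tests and feed both to Sonar. The goal names come from the jacoco-maven-plugin; the `sonar.jacoco.*` property names are assumptions to verify against the Sonar version in use:

```shell
# Sketch only: collect unit and integration coverage separately with
# the jacoco-maven-plugin, then publish both reports to Sonar.
# The sonar.jacoco.* property names are assumptions - check them
# against the Sonar documentation for your version.
mvn clean \
    org.jacoco:jacoco-maven-plugin:prepare-agent \
    org.jacoco:jacoco-maven-plugin:prepare-agent-integration \
    verify

mvn sonar:sonar \
    -Dsonar.jacoco.reportPath=target/jacoco.exec \
    -Dsonar.jacoco.itReportPath=target/jacoco-it.exec
```

The missing piece for CloudStack would be the integration-test phase itself, since that needs a deployed system with the right hardware/software available to the build slave.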
On Mon, Oct 28, 2013 at 5:53 AM, Prasanna Santhanam <t...@apache.org> wrote:
> We need a way to check coverage of (unit+integration) tests. How many
> lines of code are hit on a deployed system that correspond to the
> component donated/committed? We don't have that for existing tests, so
> it is hard to judge whether a feature that comes with tests covers
> enough of itself.
>
> On Sun, Oct 27, 2013 at 11:00:46PM +0100, Laszlo Hornyak wrote:
> > Ok, makes sense, but that sounds like even more work :) Can you share
> > the plan for how this will work?
> >
> > On Sun, Oct 27, 2013 at 7:54 PM, Darren Shepherd <
> > darren.s.sheph...@gmail.com> wrote:
> > > I think it can't be at a component level, because components are too
> > > large. It needs to be at a feature or implementation level. For
> > > example, live storage migration for Xen and live storage migration
> > > for KVM (don't know if that's a real thing) would be two separate
> > > items.
> > >
> > > Darren
> > >
> > > > On Oct 27, 2013, at 10:57 AM, Laszlo Hornyak <laszlo.horn...@gmail.com> wrote:
> > > >
> > > > I believe this will be very useful for users.
> > > > As far as I understand, someone will have to qualify components.
> > > > What will be the method for qualification? I do not think test
> > > > coverage alone would be the right measure, but then if you want to
> > > > go deeper, you need a bigger effort to test the components.
> > > >
> > > > On Sun, Oct 27, 2013 at 4:51 PM, Darren Shepherd <
> > > > darren.s.sheph...@gmail.com> wrote:
> > > >
> > > >> I don't know if a similar thing has been talked about before, but I
> > > >> thought I'd just throw this out there. The ultimate way to ensure
> > > >> quality is to have unit test and integration test coverage of all
> > > >> functionality.
> > > >> That way, somebody authors some code and commits it to, for
> > > >> example, 4.2, but then when we release 4.3, 4.4, etc., they aren't
> > > >> on the hook to manually test the functionality with each release.
> > > >> It is in the nature of a community project that people come and go.
> > > >> If a contributor wants to ensure the long-term viability of a
> > > >> component, they should ensure that there are unit+integration tests.
> > > >>
> > > >> Now, for whatever reason, good or bad, it's not always possible to
> > > >> have full integration tests. I don't want to throw down the gauntlet
> > > >> and say everything must have coverage, because that would mean some
> > > >> useful code/feature would not get in just because coverage wasn't
> > > >> possible at the time.
> > > >>
> > > >> What I propose is that we put every feature or function into a tier
> > > >> describing its quality (very similar to how OpenStack qualifies its
> > > >> hypervisor integration). Tier A means unit test and integration
> > > >> test coverage gate the release. Tier B means unit test coverage
> > > >> gates the release. Tier C means who knows, it compiled. We can go
> > > >> through and classify the components, and then as a community we can
> > > >> try to get as much as possible into Tier A.
> > > >>
> > > >> Darren
> > > >
> > > > --
> > > > EOF
> >
> > --
> > EOF
>
> --
> Prasanna.

--
EOF
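The tier scheme proposed in the quoted thread maps naturally onto a small release-gating check. A minimal sketch, purely illustrative (the thresholds, `TIER_RULES`, and `release_ready` are invented for this example, not CloudStack code or tooling):

```python
# Illustrative sketch of the proposed tier gating - not CloudStack code.
# Tier A: unit AND integration coverage gate the release.
# Tier B: only unit coverage gates the release.
# Tier C: no gate ("it compiled").
# The coverage thresholds below are made-up example values.
TIER_RULES = {
    "A": {"unit": 0.80, "integration": 0.70},
    "B": {"unit": 0.80},
    "C": {},
}

def release_ready(tier, coverage):
    """Return True if a feature's coverage satisfies its tier's gates.

    `coverage` maps a kind ("unit" or "integration") to a 0..1 ratio.
    A kind missing from `coverage` counts as 0.0.
    """
    rules = TIER_RULES[tier]
    return all(coverage.get(kind, 0.0) >= minimum
               for kind, minimum in rules.items())
```

For example, a Tier A feature with 90% unit but only 50% integration coverage would block the release, while the same numbers would pass for a Tier B feature.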