On Tue, Sep 27, 2016 at 2:05 PM, Assaf Muller <[email protected]> wrote:
> On Tue, Sep 27, 2016 at 12:18 PM, Timur Nurlygayanov <
> [email protected]> wrote:
>
>> Hi milan,
>>
>> we have measured the test coverage for OpenStack components with the
>> coverage.py tool [1]. It is a very easy tool to use, and it can measure
>> coverage by lines of code, among other metrics.
>>
>> [1] https://coverage.readthedocs.io/en/coverage-4.2/
>
> coverage also supports aggregating results from multiple runs, so you can
> measure results from combinations such as:
>
> 1) Unit tests
> 2) Functional tests
> 3) Integration tests
> 4) 1 + 2
> 5) 1 + 2 + 3
>
> To my eyes, 3 and 4 make the most sense. Unit and functional tests are
> supposed to give you low-level coverage, keeping in mind that 'functional
> tests' is an overloaded term that means something different in every
> community. Integration tests aren't about code coverage, they're about
> user-facing flows, so it'd be interesting to measure coverage from
> integration tests,

Sorry, replace "integration" with "unit + functional".

> then comparing coverage coming from integration tests, and getting the
> set difference between the two: that's the area that needs more unit and
> functional tests.

To reiterate:

Run coverage from integration tests, let this be c.
Run coverage from unit and functional tests, let this be c'.
Let diff = c \ c'.

'diff' is where you're missing unit and functional test coverage.

>> On Tue, Sep 27, 2016 at 1:06 PM, Jordan Pittier <
>> [email protected]> wrote:
>>
>>> Hi,
>>>
>>> On Tue, Sep 27, 2016 at 11:43 AM, milanisko k <[email protected]>
>>> wrote:
>>>
>>>> Dear Stackers,
>>>> I'd like to gather some overview on the $Sub: is there some
>>>> infrastructure in place to gather such stats? Are there any groups
>>>> interested in it? Any plans to establish such infrastructure?
>>>
>>> I am working on such a tool, with mixed results so far.
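[Editor's note: the set-difference step described above can be sketched in a few lines of Python, assuming the per-file covered-line sets have already been extracted from the coverage data files into plain dicts. The file names and line numbers below are hypothetical data, not output from a real run.]

```python
def coverage_diff(integration, unit_functional):
    """Return lines covered by integration runs (c) but not by
    unit + functional runs (c'): diff = c \\ c'.

    Both arguments map filename -> set of covered line numbers.
    """
    diff = {}
    for filename, lines in integration.items():
        # Lines hit by integration tests but never by unit/functional tests.
        missing = lines - unit_functional.get(filename, set())
        if missing:
            diff[filename] = missing
    return diff

# Hypothetical per-file line coverage from each kind of run.
c = {"nova/api.py": {10, 11, 12, 30}, "nova/utils.py": {5}}
c_prime = {"nova/api.py": {10, 11}}

print(coverage_diff(c, c_prime))
```

Feeding this real data would just be a matter of exporting the line sets from each suite's combined .coverage file.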
>>> Here's my approach, taking Nova as an example:
>>>
>>> 1) Print all the routes known to Nova (available as a python-routes
>>> object: nova.api.openstack.compute.APIRouterV21())
>>> 2) "Normalize" the Nova routes
>>> 3) Take the logs produced by Tempest during a Tempest run (in
>>> logs/tempest.txt.gz). Grep for what looks like a Nova URL (based on
>>> port 8774)
>>> 4) "Normalize" the tested-by-Tempest Nova routes
>>> 5) Compare the two sets of routes
>>> 6) ????
>>> 7) Profit!!
>>>
>>> So the hard part is obviously normalizing the URLs. I am currently
>>> using tons of regexes... :) That's not fun.
>>>
>>> I'll let you guys know if I have something to show.
>>>
>>> I think there's real interest in the topic (it comes up every year or
>>> so), but no definitive answer/tool.
>>>
>>> Cheers,
>>> Jordan

>> --
>>
>> Timur,
>> Senior QA Manager
>> OpenStack Projects
>> Mirantis Inc
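[Editor's note: steps 2, 4, and 5 of the approach above can be sketched roughly as follows. The normalization rules, route names, and log samples are illustrative assumptions, not the actual rules or routes from Jordan's tool.]

```python
import re

# Hypothetical normalization rules: collapse concrete IDs embedded in
# logged URLs into route-template placeholders.
UUID_RE = re.compile(
    r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}")
NUM_RE = re.compile(r"/\d+(?=/|$)")

def normalize(url):
    """Reduce a concrete URL to a route template, e.g.
    /servers/3f25.../ips -> /servers/{id}/ips."""
    url = UUID_RE.sub("{id}", url)
    url = NUM_RE.sub("/{id}", url)
    return url.rstrip("/")

# Step 1/2 would produce something like this from the Nova router;
# these routes are made up for the example.
known_routes = {"/servers/{id}/ips", "/flavors/{id}", "/servers"}

# Step 3/4: URLs grepped from the Tempest log, then normalized.
logged = [
    "/servers/3f2504e0-4f89-11d3-9a0c-0305e82c3301/ips",
    "/servers",
]
tested = {normalize(u) for u in logged}

# Step 5: set difference = routes never hit by the Tempest run.
untested = known_routes - tested
print(sorted(untested))
```

The hard part Jordan mentions is exactly the growing pile of patterns like UUID_RE and NUM_RE needed to map every logged URL back onto its route template.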
__________________________________________________________________________
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: [email protected]?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev
