On 03/31/2015 02:47 PM, Martin Koci wrote:
> Hi all,
> I'd like to open a discussion on test categorization into tiers and
> acceptance testing, specifically test tagging, which should help us
> accomplish the following goals:
> 1) Acceptance test - other FreeIPA partner projects (389/DS/PKI) should
> be able to have an "Acceptance test" that would run a basic *stable*
> test suite checking whether anything significant broke. It should be
> fast enough that the projects can run it in a Jenkins CI after commits.
> If we also had tags like @dogtag or @sssd, the projects could simply
> run just the tests affecting them -> faster execution.
> 2) FreeIPA test run optimization. Currently, all FreeIPA tests are run
> when a new commit is pushed. This takes a lot of resources. It would be
> nice to at least be able to NOT run Tier 2 tests if Tier 1 tests are
> failing, or to not run some very expensive tests after each commit, but
> maybe only once per day/week.
> So after discussions with a couple of developers and QEs, we have
> created and summarized the following proposal for sorting the current
> IPA tests into tiers.
> The currently used tests reside in freeipa/ipatests. Of these, the only
> unit tests (tier 0 candidates) are test_{ipalib,ipapython}, with the
> exception of test_ipalib/test_rpc.py, which requires Kerberos.
> The rest of the tests (the majority being XML-RPC and UI tests, ...)
> either require the ipa/lite-server or are integration tests, and thus
> fall under the definition of Tier 1 tests, as they require at least a
> running IPA instance and an admin TGT.
> As for tagging the test cases, pytest's capabilities can be used [2].
> Although pytest.mark currently does not work with declarative tests (it
> marks all of them), when the test is an ordinary function/method the
> marking works as expected. The declarative tests could be rewritten in
> the future into a more pytest-specific form, e.g.
> test_xmlrpc/test_host_plugin.py.
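
For reference, marking an ordinary (non-declarative) test class already works as
expected. A minimal sketch - the class and its body are hypothetical stand-ins,
not the real test_host_plugin.py contents:

```python
import pytest

@pytest.mark.tier1
class TestHostPlugin:
    # Hypothetical stand-in for a rewritten declarative test;
    # the class-level mark applies to every test method in the class.
    def test_host_add(self):
        assert True  # placeholder for a real XML-RPC assertion
```

Running `pytest -m tier1` would then collect this class, while
`pytest -m "not tier1"` would deselect it.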
> An official guideline for this categorization will be created on the
> upstream wiki once we agree on it.
> As for acceptance testing: similar to the `Test categorization into
> tiers` proposal [1], there is a need to define a subset of FreeIPA
> tests that could be run by other projects or users to find out whether
> or not their changes (e.g. a new build or feature) work with IPA.
> This run could be composed of a tier {0,1} execution followed by a
> subset of integration test cases. The proposed mechanism for this is
> the same as in [4]: using pytest.mark to select the classes/tests to
> run in this context.
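
A concrete way the selection could be wired up - a sketch only; the marker
names are the ones proposed below, but the conftest.py hook and the
descriptions are my assumption, not an agreed implementation:

```python
# conftest.py -- hypothetical sketch: register the proposed markers so that
# selection with "pytest -m ..." works with documented, typo-checked names.
MARKERS = {
    "tier0": "unit tests, no running server required",
    "tier1": "tests requiring a running IPA instance and an admin TGT",
    "tier2": "expensive integration tests",
    "acceptance": "stable subset runnable by partner projects",
    "sssd": "tests exercising the SSSD sub-component",
    "dogtag": "tests exercising the Dogtag sub-component",
}

def pytest_configure(config):
    # Each registered marker shows up in "pytest --markers" output.
    for name, description in MARKERS.items():
        config.addinivalue_line("markers", "%s: %s" % (name, description))
```

An acceptance run for, say, the Dogtag project could then be
`pytest -m "acceptance and dogtag"`, and since pytest exits non-zero on
failure, `pytest -m tier1 && pytest -m tier2` would skip Tier 2 when Tier 1
fails.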
> What I'd like to ask you here is to share any ideas on the form of the
> acceptance run as well as to help me identify the areas (and tests) that
> are considered important and should be a part of this test set.
> Tagging the actual test classes with a pytest decorator
> (http://pytest.org/latest/mark.html) would be better than letting
> developers manually maintain lists of tests for different projects. The
> benefit of keeping the pytest mark in the code is that whatever we do
> with a test class (rename, move, merge), the tag goes with it; no
> extra list needs to be maintained.
> As for tagging itself, the original idea which Martin Kosek was
> proposing was to use just the "acceptance" tag for marking the base T2
> tests that would be part of FreeIPA acceptance tests.
> However, it seems there is value in also tagging the tests that
> exercise a certain sub-component of FreeIPA - SSSD, Dogtag. As long as
> we do not get too wild with the tags, it should be OK.
> So we could agree on the following tags:
> - tier0, tier1, tier2
> - acceptance
> - sssd
> - dogtag
> This would lead to e.g.
> @pytest.mark.dogtag
> @pytest.mark.acceptance
> @pytest.mark.tier2
> class TestExternalCA(IntegrationTest):
> ...
> or simpler
> @dogtag
> @acceptance
> @tier2
> class TestExternalCA(IntegrationTest):
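
The simpler spelling would just need module-level aliases defined somewhere
shared - where they would live (e.g. an ipatests helper module) is
hypothetical here, and the IntegrationTest base class is omitted for brevity:

```python
import pytest

# Aliases so tests can write "@tier2" instead of "@pytest.mark.tier2";
# the module these live in is still to be decided.
tier2 = pytest.mark.tier2
acceptance = pytest.mark.acceptance
dogtag = pytest.mark.dogtag

@dogtag
@acceptance
@tier2
class TestExternalCA:  # IntegrationTest base omitted in this sketch
    ...
```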
> Hope it's not too long and that it makes sense. 

It makes a lot of sense to me (it should, since I contributed to this proposal
too). So I will be looking forward to other developers' thoughts on this.

If there are no objections, we could start with the actual patches and have
them properly reviewed.

> Can I get your thoughts on this, please?
> Thank you.
> Regards,
> /koca
> *[1] - https://fedorahosted.org/freeipa/ticket/4922
> *[2] - http://pytest.org/latest/mark.html
