Thanks for your suggestion, Brant. I experimented with several approaches
like that in Tempest, and annotating testtools.skipIf above the setUp()
method can deliver a similar result.
How about this annotation approach, Joe? It meets both of the requirements
we discussed: it avoids code duplication and it accurately logs which tests
are skipped. The drawback is that it seems less intuitive than an annotation
above the setUpClass method, but it does not require implementing any
additional annotation.
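Just to illustrate, a rough sketch of what I mean is below (the module-level
flag is only a placeholder for whatever configuration check the real test
would use, not actual Tempest code):
---
import testtools

# Placeholder for the real configuration / environment check.
ACCOUNT_QUOTAS_AVAILABLE = False


class AccountQuotasTest(testtools.TestCase):

    # Skipping in setUp() marks every test in this class as SKIP in the
    # output, without repeating the decorator on each test method.
    # Note that the skipIf condition is evaluated at import time.
    @testtools.skipIf(not ACCOUNT_QUOTAS_AVAILABLE,
                      "Account quotas middleware not available")
    def setUp(self):
        super(AccountQuotasTest, self).setUp()

    def test_upload_valid_object(self):
        pass  # reported individually as SKIP when the condition is true
---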
Best Regards,
Daisuke Morita
(2013/12/05 8:59), Brant Knudson wrote:
In Keystone, we've got some tests that "raise self.skipTest('...')" in
the test class setUp() method (not setUpClass). My testing shows that if
there are several tests in the class, then it shows all of those tests as
skipped (not just 1 skip). Does this do what you want?
Here's an example:
http://git.openstack.org/cgit/openstack/keystone/tree/keystone/tests/test_ipv6.py?id=73dbc00e6ac049f19d0069ecb07ca8ed75627dd5#n30
http://git.openstack.org/cgit/openstack/keystone/tree/keystone/tests/core.py?id=73dbc00e6ac049f19d0069ecb07ca8ed75627dd5#n500
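The pattern is roughly the following (just the shape of it, not the actual
Keystone code; the precondition check is a placeholder):
---
import testtools


def ipv6_enabled():
    # Placeholder for the real environment check the tests depend on.
    return False


class IPv6Test(testtools.TestCase):

    def setUp(self):
        super(IPv6Test, self).setUp()
        if not ipv6_enabled():
            # skipTest() raises SkipTest; because setUp() runs before every
            # test, each test in the class is reported as skipped.
            self.skipTest('IPv6 is not enabled on this host')

    def test_ipv6_connectivity(self):
        pass
---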
- Brant
On Wed, Dec 4, 2013 at 5:46 AM, Daisuke Morita
<[email protected]> wrote:
Hi, everyone.
Which do you think is the better way of coding test skipping: writing a
cls.skipException statement in the setUpClass method, or a skipIf annotation
on each test method?
This question came to me while reviewing
https://review.openstack.org/#/c/59759/ . I think the work itself is
great and I hope this patch is merged into Tempest; I just want to focus
on coding style and the explicitness of the test output.
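To make the comparison concrete, the two styles look roughly like this (the
class names, the flag, and the reason strings are only placeholders, not the
actual Tempest code):
---
import testtools

QUOTAS_AVAILABLE = False  # placeholder for the real configuration check


class WithSetUpClassSkip(testtools.TestCase):
    """Style 1: raise cls.skipException in setUpClass."""

    @classmethod
    def setUpClass(cls):
        super(WithSetUpClassSkip, cls).setUpClass()
        if not QUOTAS_AVAILABLE:
            # The whole class is skipped and testr reports a single
            # "setUpClass (...)" SKIP entry.
            raise cls.skipException("Account quotas middleware not available")

    def test_user_modify_quota(self):
        pass


class WithSkipIfAnnotation(testtools.TestCase):
    """Style 2: skipIf annotation on each test method."""

    @testtools.skipIf(not QUOTAS_AVAILABLE,
                      "Account quotas middleware not available")
    def test_user_modify_quota(self):
        # Each decorated test shows up individually as SKIP in the output.
        pass
---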
If the skipIf annotation is used, the test output of Swift is as follows.
---
tempest.api.object_storage.test_account_quotas.AccountQuotasTest
    test_admin_modify_quota[gate,smoke]                     SKIP  1.15
    test_upload_large_object[gate,negative,smoke]           SKIP  0.03
    test_upload_valid_object[gate,smoke]                    SKIP  0.03
    test_user_modify_quota[gate,negative,smoke]             SKIP  0.03
tempest.api.object_storage.test_account_services.AccountTest
    test_create_and_delete_account_metadata[gate,smoke]     OK    0.32
    test_list_account_metadata[gate,smoke]                  OK    0.02
    test_list_containers[gate,smoke]                        OK    0.02
...(SKIP)...
Ran 54 tests in 85.977s
OK
---
On the other hand, if cls.skipException is used, the output changes as
follows.
---
setUpClass (tempest.api.object_storage.test_account_quotas.AccountQuotasTest)
                                                            SKIP  0.00
tempest.api.object_storage.test_account_services.AccountTest
    test_create_and_delete_account_metadata[gate,smoke]     OK    0.48
    test_list_account_metadata[gate,smoke]                  OK    0.02
    test_list_containers[gate,smoke]                        OK    0.02
...(SKIP)...
Ran 49 tests in 81.475s
OK
---
I believe the output produced with the skipIf annotation is better. Since
the test coverage is displayed more explicitly, it is easier to find out
which tests are really skipped.
I scanned the whole Tempest code base: there are 63 cls.skipException
statements and 24 skipIf annotations. Replacing them is not a trivial task,
but I think the most important thing for testing is to produce consistent
and accurate logs.
Am I missing something? Or has this kind of discussion already taken place
in the past? If so, could you let me know?
Best Regards,
--
Daisuke Morita <[email protected]>
NTT Software Innovation Center, NTT Corporation
--
Daisuke Morita <[email protected]>
NTT Software Innovation Center, NTT Corporation
_______________________________________________
OpenStack-dev mailing list
[email protected]
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev