Hi all,

In the test wg meetings I've mentioned the goals I have for optimizing the 
effort required to develop test documentation and coverage. In summary,

*         Tests should be self-documenting: no separate "test plan" should be 
needed beyond the entries in the test database and the comments in the tests

*         Tests should include specific test assertions, each identified by an 
ID, which provide all the information necessary to understand the steps of the 
test beyond a general description. For example, for the test 
vHello_Tacker.sh<https://git.opnfv.org/models/tree/tests/vHello_Tacker.sh> see 
the header below.
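
As a rough sketch of what per-assertion reporting could look like inside a test 
script (the helper function name, log path, and the example check are all 
made-up illustrations, not part of vHello_Tacker.sh):

```shell
#!/bin/bash
# Hypothetical helper: log each discrete assertion by ID so pass/fail results
# can later be pushed to the test results database in addition to the overall
# test verdict. The log format "PASS <id>" / "FAIL <id>" is an assumption.

ASSERT_LOG=/tmp/assertions.log
: > "$ASSERT_LOG"

assert() {
  local id="$1"; shift
  if "$@"; then
    echo "PASS $id" >> "$ASSERT_LOG"
  else
    echo "FAIL $id" >> "$ASSERT_LOG"
  fi
}

# Example usage with an assertion ID from the header below; the check itself
# is a placeholder, not the real keypair verification:
assert models-nova-001 test -n "placeholder-keypair-name"
cat "$ASSERT_LOG"
```

A real test would call the helper after each verified step, so the log doubles 
as the per-assertion report.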

*         The test assertions can be managed in a database if that proves 
necessary and as effective as a simple flat file. For now, a flat file will do, 
and the assertions can be further described as needed on a wiki; see 
test-assertions<https://wiki.opnfv.org/display/models/test-assertions> on the 
Models wiki for an example. With the flat-file approach, we can use simple bash 
scripts (sed etc.) to change the IDs as needed (e.g. as they get renamed, 
split, merged, etc., as will typically happen as tests are developed).
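
For example, a bulk rename of an assertion ID across the flat file and all test 
scripts could be as simple as the following sketch (the paths, stand-in files, 
and the old/new IDs are made up for illustration):

```shell
#!/bin/bash
# Hypothetical sketch: rename an assertion ID everywhere it appears.
# GNU sed's -i (in-place edit) is assumed, as on a typical OPNFV jumphost.

OLD_ID="models-tacker-001"
NEW_ID="models-tacker-101"

# Stand-in flat file and test script, created here only for demonstration:
mkdir -p /tmp/assertion-demo/tests
echo "$OLD_ID: Tacker installation in a docker container" \
  > /tmp/assertion-demo/assertions.txt
echo "#   $OLD_ID (Tacker installation)" \
  > /tmp/assertion-demo/tests/vHello_Tacker.sh

# The actual rename: one sed pass over the flat file and every test script.
sed -i "s/$OLD_ID/$NEW_ID/g" \
  /tmp/assertion-demo/assertions.txt /tmp/assertion-demo/tests/*.sh

grep "$NEW_ID" /tmp/assertion-demo/assertions.txt
```

Splits and merges would follow the same pattern, with one sed expression per ID 
change.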

*         Test coverage can be assessed by processing the set of test scripts 
to extract the referenced assertions, and comparing them to the test assertion 
database. Alternatively (and recommended), we can build the test coverage map 
by adding assertion pass/fail reports (for the discrete assertions, in addition 
to the overall test result) to the test results database.
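
The first option (static extraction) could look like this sketch; the paths, 
stand-in files, and the assumed ID pattern (models-<area>-NNN) are all 
illustrations:

```shell
#!/bin/bash
# Hypothetical sketch: compare assertion IDs referenced in test scripts
# against the flat-file assertion database to find uncovered assertions.

mkdir -p /tmp/coverage-demo/tests
# Stand-in database: one assertion per line, ID in the first field.
printf '%s\n' \
  "models-tacker-001 Tacker install" \
  "models-tacker-002 VNFD creation" \
  "models-nova-001 Keypair creation" > /tmp/coverage-demo/assertions.txt
# Stand-in test script referencing two of the three IDs:
printf '#   %s\n' "models-tacker-001 (install)" "models-nova-001 (keypair)" \
  > /tmp/coverage-demo/tests/vHello_Tacker.sh

# IDs defined in the database vs. IDs referenced by the tests:
cut -d' ' -f1 /tmp/coverage-demo/assertions.txt | sort -u \
  > /tmp/coverage-demo/defined
grep -hoE 'models-[a-z-]+-[0-9]+' /tmp/coverage-demo/tests/*.sh | sort -u \
  > /tmp/coverage-demo/referenced

# Assertions with no test coverage (here: models-tacker-002):
comm -23 /tmp/coverage-demo/defined /tmp/coverage-demo/referenced
```

The same extraction could feed a wiki page or a coverage report in CI.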

I'd like to get your feedback on this approach. The bottom-line goal is to have 
test documentation and coverage info with the least development and maintenance 
effort.

# What this is: Deployment test for the Tacker Hello World blueprint.
#
# Status: work in progress, planned for OPNFV Danube release.
#
# Use Case Description: A single-node simple python web server, connected to
# two internal networks (private and admin), and accessible via a floating IP.
# Based upon the OpenStack Tacker project's "tosca-vnfd-hello-world" blueprint,
# as extended for testing of more Tacker-supported features as of OpenStack
# Mitaka.
#
# Pre-State:
# models-joid-001 | models-apex-001 (installation of OPNFV system)
#
# Test Steps and Assertions:
# 1) bash vHello_Tacker.sh tacker-cli setup
#   models-tacker-001 (Tacker installation in a docker container on the
#   jumphost)
#   models-nova-001 (Keypair creation)
# 2) bash vHello_Tacker.sh tacker-cli start
#   models-tacker-002 (VNFD creation)
#   models-tacker-003 (VNF creation)
#   models-tacker-vnfd-001 (config_drive creation)
#   models-tacker-vnfd-002 (artifacts creation)
#   models-tacker-vnfd-003 (user_data creation)
#   models-vhello-001 (vHello VNF creation)
# 3) bash vHello_Tacker.sh tacker-cli stop
#   models-tacker-004 (VNF deletion)
#   models-tacker-005 (VNFD deletion)
#   models-tacker-vnfd-004 (artifacts deletion)
# 4) bash vHello_Tacker.sh tacker-cli clean
#   TODO: add assertions
#
# Post-State:
# After step 1, Tacker is installed and active in a docker container, and the
# test blueprint etc. are prepared in a shared virtual folder /tmp/tacker.
# After step 2, the VNF is running and verified.
# After step 3, the VNF is deleted and the system is returned to the step 1
# post-state.
# After step 4, the system is returned to the test pre-state.
#
# Cleanup: bash vHello_Tacker.sh tacker-cli clean


Thanks,
Bryan Sullivan | AT&T

_______________________________________________
opnfv-tech-discuss mailing list
opnfv-tech-discuss@lists.opnfv.org
https://lists.opnfv.org/mailman/listinfo/opnfv-tech-discuss