Hello. There are a few types of deployment regressions possible: when changing the version of a module consumed from upstream (or from an internal module repo), for example from Liberty to Mitaka, or when changing the composition layer (modular tasks in Fuel), specifically by adding/removing/changing classes and class parameters.
An example regression in the swift deployment data is [0]: something was changed unnoticed by the existing noop tests, and as a result swift data ended up being stored in the root partition.

The suggested per-commit regression detection [1] for deployment data automatically detects whether a class in a noop catalog run has gained or lost a parameter, or has had a parameter value updated, by the patch under test. Later, this check could even replace the existing noop tests; everything would be checked automatically, provided every deployment scenario is covered by a corresponding template, represented as YAML files [2] in Fuel. Note: the tool [3] can help to get all deployment cases (-Y) and all deployment tasks (-S) as well.

I propose to review the patch [1], understand how it works (see the tl;dr section below) and start using it ASAP. The earlier we commit the "initial" data layer state, the fewer regressions will pop up.

(tl;dr)

The check should be done for every modular component (aka deployment task). Data generated in the noop catalog run for all classes and defines of a given deployment task should be verified against its "acknowledged" (committed) state, and the test gate should fail if changes have been found, such as a new parameter with a defined value, a removed parameter, or a changed parameter value.

In order to resolve a regression, a patch author will have to add (and reviewers should acknowledge) the detected changes in the committed state of the deployment data. This may be done manually, with a tool like [3], by a pre-commit hook, or even on the CI side. The regression check should show the diff between the committed state and the new state proposed in a patch. The changed state should be *reviewed* and accepted with the patch to become the committed one. So the deployment data will evolve with *only* approved changes, and those changes will be very easy to discover for each patch under review. No more regressions, everyone happy.

Examples:

- A. A patch author removed the mpm_module parameter from the composition layer (the apache modular task). The test should fail with a diff:

@@ -90,7 +90,7 @@
   manage_user => 'true',
   max_keepalive_requests => '100',
   mod_dir => '/etc/httpd/conf.d',
-  mpm_module => 'false',
+  mpm_module => 'prefork',
   name => 'Apache',
   package_ensure => 'installed',
   ports_file => '/etc/httpd/conf/ports.conf',

It illustrates that mpm_module's committed value was 'false', but the new one came in as 'prefork', likely from the apache class defaults.

The solution: follow the failed build link and look at the detected changes (a diff). Acknowledge the changes and include the rebuilt templates in the patch as a new revision. An example command for the tool [3] (use -h for help):

./utils/jenkins/fuel_noop_tests.rb -q -b -s api-proxy/api-proxy_spec.rb

Or edit the committed templates manually and include the data changes in the patch as well.

- B. An upstream module author added a new parameter mpm_mode with a default of '123'. The test should fail with a diff:

@@ -90,6 +90,7 @@
   manage_user => 'true',
   max_keepalive_requests => '100',
   mod_dir => '/etc/httpd/conf.d',
+  mpm_mode => '123',
   mpm_module => 'false',
   name => 'Apache',
   package_ensure => 'installed',

It illustrates that the composition layer is not consistent with the upstream module's data schema, which could be a potential regression in deployment (a new parameter added upstream goes with its default and is ignored by the composition manifest).

The solution is the same as for case A.
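To make the mechanism above a bit more concrete, here is a minimal sketch (in Ruby, like the noop tests themselves) of the kind of state comparison described in the tl;dr. This is *not* the code from [1]; the file names and the flattened {class => {parameter => value}} data layout are assumptions made only for this illustration:

#!/usr/bin/env ruby
# Sketch only: compare the committed ("acknowledged") deployment data state
# with the state produced by a fresh noop catalog run and report differences.
require 'yaml'

committed = YAML.load_file('committed/apache.yaml')   # acknowledged state (hypothetical path)
generated = YAML.load_file('generated/apache.yaml')   # fresh noop run state (hypothetical path)

changed = false
(committed.keys | generated.keys).sort.each do |klass|
  old_params = committed.fetch(klass, {})
  new_params = generated.fetch(klass, {})
  (old_params.keys | new_params.keys).sort.each do |param|
    next if old_params[param] == new_params[param]
    changed = true
    # covers added, removed and changed parameters alike
    puts "#{klass}/#{param}: #{old_params[param].inspect} -> #{new_params[param].inspect}"
  end
end

# a non-empty report means the data layer changed; the gate should fail
# until the new state is reviewed and committed together with the patch
exit(changed ? 1 : 0)

The point of the sketch is only the workflow: any detected difference becomes a visible diff that must be reviewed and re-committed with the patch, exactly as in examples A and B above.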
[0] https://bugs.launchpad.net/fuel/+bug/1508482
[1] https://review.openstack.org/240015
[2] https://github.com/openstack/fuel-library/tree/master/tests/noop/astute.yaml
[3] https://review.openstack.org/#/c/240015/7/utils/jenkins/fuel_noop_tests.rb

--
Best regards,
Bogdan Dobrelya,
Irc #bogdando