> Hi,
>
> let me try to rephrase this a bit, and Bogdan will correct me if I'm wrong
> or missing something.
>
> We have a set of top-scope manifests (called Fuel puppet tasks) that we use
> for OpenStack deployment. We execute those tasks with "puppet apply". Each
> task is supposed to bring the target system into some desired state, so
> puppet compiles a catalog and applies it. So basically, puppet catalog =
> desired system state.
>
> So we can compile* catalogs for all top-scope manifests in the master
> branch and store those compiled* catalogs in the fuel-library repo. Then,
> for each proposed patch, CI will compare the new catalogs with the stored
> ones and print out the difference, if any. This will pretty much show what
> is going to be changed in the system configuration by the proposed patch.
>
> We have discussed such checks several times before, iirc, but we did not
> have the right tools to implement them. Well, now we do :) I think it
> could be quite useful even in non-voting mode.
>
> * By "compiled catalogs" I don't mean actual/real puppet catalogs; I mean
> sorted lists of all classes/resources with all their parameters that we
> find during the puppet-rspec tests in our noop test framework, something
> like standard puppet-rspec coverage. See example [0] for the networks.pp
> task [1].
>
> Regards,
> Alex
>
> [0] http://paste.openstack.org/show/477839/
> [1] https://github.com/openstack/fuel-library/blob/master/deployment/puppet/osnailyfacter/modular/openstack-network/networks.pp
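As a rough illustration of the comparison Alex describes, a "compiled catalog" can be modeled as a sorted list of resources with their parameters, and the check reduces to a plain diff of two such lists. This is only a hypothetical sketch in Python; the real noop framework produces these states via puppet-rspec, and all names below are illustrative:

```python
import difflib

def normalize(catalog):
    """Render a catalog (dict: resource title -> params) as sorted lines."""
    lines = []
    for title in sorted(catalog):
        for key in sorted(catalog[title]):
            lines.append("%s { %s => %s }" % (title, key, catalog[title][key]))
    return lines

def catalog_diff(committed, proposed):
    """Return unified-diff lines between two catalog states."""
    return list(difflib.unified_diff(
        normalize(committed), normalize(proposed),
        fromfile="committed", tofile="proposed", lineterm=""))

# A one-parameter change in a manifest shows up directly in the diff.
old = {"Sysctl['net.ipv4.ip_forward']": {"val": "0"}}
new = {"Sysctl['net.ipv4.ip_forward']": {"val": "1"}}
for line in catalog_diff(old, new):
    print(line)
```

An unchanged catalog yields an empty diff, which is exactly the "no difference" case CI would stay silent about.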
Thank you, Alex. Yes, the composition layer is a set of top-scope manifests, known as Fuel library modular tasks [0]. The "deployment data checks" are nothing more than comparing the committed vs. changed states of the fixtures [1] of puppet catalogs for known deployment paths under test, with rspecs written for each modular task [2].

And the *current status* is:
- the script for data layer checks is now implemented [3]
- a how-to is being documented here [4]
- a fix to make catalog compilation idempotent has been submitted [5]
- and there is my WIP branch [6] with the initial committed state of deploy
  data pre-generated.

So you can check out the branch, make any test changes to manifests, and run the data check (see the README [4]). It works for me: there are no issues with idempotent re-checks of a clean committed state, and no tests fail unexpectedly.

So the plan is to implement this noop tests extension as a non-voting CI gate, after I add an example workflow for developers to the Fuel wiki. Thoughts?

[0] https://github.com/openstack/fuel-library/blob/master/deployment/puppet/osnailyfacter/modular
[1] https://github.com/openstack/fuel-library/tree/master/tests/noop/astute.yaml
[2] https://github.com/openstack/fuel-library/tree/master/tests/noop/spec
[3] https://review.openstack.org/240015
[4] https://github.com/openstack/fuel-library/blob/master/tests/noop/README.rst
[5] https://review.openstack.org/247989
[6] https://github.com/bogdando/fuel-library-1/commits/data_checks

--
Best regards,
Bogdan Dobrelya,
Irc #bogdando

__________________________________________________________________________
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: [email protected]?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev
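P.S. The data check and its idempotent re-check property can be sketched in a few lines. This is a conceptual Python sketch only; the `data_check()` helper, the task names, and the catalog strings are illustrative assumptions, not the actual fuel-library script [3]:

```python
def data_check(committed_state, regenerated_state):
    """Return the sorted list of tasks whose catalog state changed.

    Each state maps a task name to its serialized catalog fixture.
    """
    changed = []
    for task in sorted(set(committed_state) | set(regenerated_state)):
        if committed_state.get(task) != regenerated_state.get(task):
            changed.append(task)
    return changed

# Idempotent re-check of a clean committed state: nothing is reported.
committed = {"networks.pp": "Class[Networks] { ensure => present }"}
assert data_check(committed, dict(committed)) == []

# A patched manifest surfaces as changed deployment data; being
# non-voting, CI would print the difference rather than reject the patch.
patched = {"networks.pp": "Class[Networks] { mtu => 9000 }"}
print(data_check(committed, patched))  # -> ['networks.pp']
```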
