On 06/04/2013 09:48 PM, Jakub Hrozek wrote:
On Mon, Jun 03, 2013 at 02:05:09PM +0200, Petr Viktorin wrote:
A design document for integration testing is available at
http://www.freeipa.org/page/V3/Integration_testing. I've copied it below for
easier quoting.


= Overview =

Make it possible to write and run multi-host integration tests (such as:
install master & replica, add user on replica, verify it's added on master).

These tests will be run from continuous integration.
Any developer can also run them manually.

= Use Cases =

== Continuous integration ==

The developer team at Red Hat will run a Jenkins continuous integration
server that will run the tests automatically (after each commit, if
resources allow).

The CI results will be posted publicly.

== Developer testing ==

Anyone is able to run integration tests without advanced infrastructure;
only a number of virtual machines to run the tests on is needed.

== Beaker integration ==

The tests will run seamlessly inside [http://beaker-project.org/
Beaker]/[https://fedoraproject.org/wiki/QA/RHTS RHTS].
A special option enables reporting via BeakerLib.

= Non-goals =

A complete testing/continuous integration setup needs some steps that will
not be included in IPA's test suite:

* Building the code
* VM provisioning
     There are just too many disparate ways to do it; people
     with a virtual datacenter should already have a preferred tool.
     If we come up with something for ourselves, we'll have to make too
     many assumptions for it to be useful elsewhere.
* Configuring the basic system, installing the packages
     Again, support for this can be added in the future.
     (Release Puppet/Ansible configuration?)

= Design =

The Python package with the IPA test suite is renamed to <tt>ipatests</tt>,
packaged for RPM-based systems as <tt>freeipa-tests</tt>.
Eventually the package will be included in Fedora.

Integration tests will be controlled from a single machine, and executed
on a number of "remote" machines that act as servers, replicas, and clients.
The controlling machine communicates with the others via the SSH protocol.
(The controlling machine may be the same as one of the "remote" ones.)
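The controller/remote split can be sketched with a small helper that shells
out to <tt>ssh</tt>; <tt>build_ssh_argv</tt> and <tt>run_remote</tt> are
hypothetical names for illustration, not part of the ipatests API:

```python
import shlex
import subprocess

def build_ssh_argv(host, remote_cmd, user="root"):
    """Build an argv list that runs remote_cmd on `host` as `user` via ssh."""
    # Quote each argument so the remote shell sees the command unmangled.
    quoted = " ".join(shlex.quote(part) for part in remote_cmd)
    return ["ssh", "%s@%s" % (user, host), quoted]

def run_remote(host, remote_cmd):
    """Run remote_cmd on host and return its exit status.

    Requires sshd on the remote host to allow root login, as the design
    above mandates.
    """
    return subprocess.call(build_ssh_argv(host, remote_cmd))
```

A real runner would keep a persistent connection (e.g. via a library such
as paramiko) rather than spawning <tt>ssh</tt> per command, but the control
flow is the same.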

Integration tests are included in the main IPA test suite, and configured
via environment variables. If the variables are missing, all integration
tests are skipped. If an insufficient number of hosts is configured for a
test, the individual test will be skipped.
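The skip-when-unconfigured behavior might look roughly like this; the
<tt>get_hosts</tt> helper and its exact variable handling are illustrative,
not the actual implementation:

```python
import os
import unittest

def get_hosts(var, count):
    """Return `count` FQDNs taken from the environment variable `var`.

    Raises unittest.SkipTest (which Nose honors) when the variable is
    missing or does not provide enough hosts, so the test is skipped
    rather than failed.
    """
    hosts = os.environ.get(var, "").split()
    if len(hosts) < count:
        raise unittest.SkipTest(
            "%s does not provide %d host(s); skipping" % (var, count))
    return hosts[:count]
```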

A tool is provided to run installed tests.

The remote machines used for integration testing are required to have
IPA packages installed, the firewall opened up, any needed workarounds
applied (package downgrades, SELinux mode, ...), and sshd set up to allow
root login. The test runner will connect to these machines, install IPA,
perform the tests, and then uninstall IPA & return the systems to their
previous state.

A plugin for integration with BeakerLib is provided.

= Test configuration =

Tests are configured using these environment variables.

== Host configuration ==

; $MASTER
:  FQDN of the first IPA server
; $REPLICA
:  FQDNs of other IPA servers (space-separated)
; $CLIENT
:  FQDNs of IPA clients (space-separated)
; $MASTER_env2, $REPLICA_env2, $CLIENT_env2, $MASTER_env3, ...
:  can be used for additional domains when needed

DNS needs to be set up so that IP addresses can be obtained for these hosts.
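As a sketch of how such variables could be collected into per-environment
host lists (the <tt>read_env_config</tt> helper is hypothetical, and takes
the environment as a dict for easy testing):

```python
def read_env_config(env, num=1):
    """Collect host configuration for environment `num` from variables
    named MASTER, REPLICA, CLIENT; additional environments use the
    _env2, _env3, ... suffix convention described above.
    """
    suffix = "" if num == 1 else "_env%d" % num

    def fqdns(var):
        # Multi-valued variables are space-separated lists of FQDNs.
        return env.get(var + suffix, "").split()

    return {
        "master": fqdns("MASTER"),      # first IPA server
        "replicas": fqdns("REPLICA"),   # other IPA servers
        "clients": fqdns("CLIENT"),     # IPA clients
    }
```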

== Basic configuration ==

:  Directory for test data on the remote hosts
:  Default: /root/ipatests
:  IP of a DNS forwarder
:  Default:

== Test customization ==

:  IPA domain name
:  Default: taken from $MASTER
:  NIS domain name
:  Default: ipatest
:  Set to TRUE for IPv6-only connectivity
:  Set to enable test debugging

:  Admin username
:  Default: admin
:  Admin user password
:  Default: Secret123
:  Directory manager DN
:  Default: cn=Directory Manager
:  Directory manager password
:  Default: Secret123
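Applying the documented defaults could look like the sketch below; every
variable name other than MASTER is an illustrative placeholder, since the
real names are not spelled out here:

```python
def settings_with_defaults(env):
    """Fill in the defaults documented above from an environment dict.

    Variable names besides MASTER are placeholders for illustration.
    """
    master = env.get("MASTER", "")
    # The IPA domain defaults to the domain part of $MASTER's FQDN.
    domain = env.get("DOMAIN") or master.partition(".")[2]
    return {
        "domain": domain,
        "nisdomain": env.get("NISDOMAIN", "ipatest"),
        "admin_name": env.get("ADMIN_NAME", "admin"),
        "admin_password": env.get("ADMIN_PASSWORD", "Secret123"),
        "dirman_dn": env.get("DIRMAN_DN", "cn=Directory Manager"),
        "dirman_password": env.get("DIRMAN_PASSWORD", "Secret123"),
    }
```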

= Supporting tools =

== ipa-test-config ==

This tool reads the configuration variables above and outputs a Bash script
that sets a much more complete set of variables for easy shell-based testing
or test set-up.

Without arguments, <tt>ipa-test-config</tt> outputs information specific
to the host it is run on. When given a hostname, it prints the configuration
for that host.
With the <tt>--global</tt> flag, it outputs configuration common to all
hosts.

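The Bash-script output described above could be produced roughly like this
(<tt>emit_bash</tt> is a hypothetical sketch, not the tool's actual code or
output format):

```python
import shlex

def emit_bash(config):
    """Render a config dict as a Bash script of export statements,
    the general shape of what a tool like ipa-test-config produces.
    """
    # shlex.quote protects values containing spaces or shell metacharacters.
    lines = ["export %s=%s" % (key, shlex.quote(value))
             for key, value in sorted(config.items())]
    return "\n".join(lines) + "\n"
```

A shell-based test could then pick the variables up with something like
<tt>eval "$(ipa-test-config)"</tt>.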
== ipa-run-tests ==

This tool is a wrapper around <tt>nosetests</tt> and accepts the same
options as Nose.
It loads any additional plugins and runs tests from the system-installed IPA
test suite.

== Other ==

TBD: Additional command-line tools may be provided for tasks such as
installing IPA in a given topology.

= Implementation =

Test cases are implemented as Nose test classes, with
installation/uninstallation as class setup/teardown.
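The pattern might look like the sketch below; the class and the commented-out
install/uninstall helpers are illustrative, not the actual test code:

```python
import os
import unittest

class SimpleReplicationTest(object):
    """Sketch of the Nose pattern: install in class setup, run test
    methods, uninstall in class teardown.
    """

    @classmethod
    def setup_class(cls):
        cls.master = os.environ.get("MASTER")
        cls.replica = os.environ.get("REPLICA")
        if not (cls.master and cls.replica):
            # Nose reports unittest.SkipTest as a skip, not a failure.
            raise unittest.SkipTest("MASTER and REPLICA are not configured")
        # install_master(cls.master); install_replica(cls.replica) would go here

    @classmethod
    def teardown_class(cls):
        # Uninstall IPA and restore the hosts to their previous state here.
        pass

    def test_user_replicates(self):
        # Add a user on the replica, then verify it appears on the master.
        pass
```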

A BeakerLib plugin is provided that starts/ends Beaker phases for Nose test
contexts and cases, issues a Beaker assertion (rlPass/rlFail) for each test
case, and collects and submits relevant logs.
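The mapping from test results to BeakerLib phases and assertions could be
sketched as follows; <tt>BeakerReporter</tt> is a duck-typed illustration,
not the real plugin, though <tt>rlPhaseStart</tt>/<tt>rlPass</tt>/<tt>rlFail</tt>
are genuine BeakerLib commands:

```python
class BeakerReporter(object):
    """Collects nose-style results and renders them as BeakerLib commands."""

    def __init__(self):
        self.lines = []

    def start_phase(self, name):
        # A FAIL-type phase fails the Beaker task if any assertion fails.
        self.lines.append("rlPhaseStart FAIL %s" % name)

    def add_success(self, test_name):
        self.lines.append("rlPass %s" % test_name)

    def add_failure(self, test_name, message):
        self.lines.append("rlFail %s: %s" % (test_name, message))

    def end_phase(self):
        self.lines.append("rlPhaseEnd")
```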

A separate plugin will be provided to collect logs outside of a Beaker
environment.

= Example instructions =
To run the test called <tt>test_integration/test_simple_replication.py</tt>,
which needs to run with two masters, follow these instructions.

Install the IPA server packages on two machines, and do any preparations
necessary to install IPA (e.g. configure/disable the firewall).

Then, install the <tt>freeipa-tests</tt> package on the machine that will
run the tests (this may be one of the machines above, or preferably a
different machine).
Set the MASTER and REPLICA environment variables to the fully qualified
domain names of the two machines prepared earlier.
Also set any other relevant variables listed in
[[#Test configuration|Test configuration]].
You may run <tt>ipa-test-config --global</tt> to verify how the test
configuration will be handled.

The next steps depend on whether the test will run within a BeakerLib
environment or not.

How much is the test library tied to IPA? I'd like to reuse it for SSSD
tests, obviously those tests that would be able to run against an IPA
server could be easy, but what about tests that require some other
custom LDAP server? Would the library allow rolling out a more generic
server or would we need to set up a "stable" machine which the tests
would connect to?

The test library is tied to ipapython. In particular the logging infrastructure wouldn't be easy to replace. So you'll need some IPA packages installed on the controlling machine, but no server needs to be running.
On the remote machines, all you need is an SSH-2 server.

The library doesn't "roll out" servers. Everyone's setup can be different
(static servers, Beaker, cloud, local VM provisioning, ...). Unfortunately,
the only sane way to support everything is to have the user write a script
for it. So the tests assume servers are already prepared, up to RPM
installation, and they just run ipa-server-install & co.

I'll set up a Jenkins instance to run the tests and release much of the
configuration, but this will only be an example of how to do it in one
particular case.

For AD, the tests will expect an AD server fully set up, since you can't
really install Windows automatically. You can either do the same for a
custom LDAP server, or you can have the tests install/uninstall it.


Freeipa-devel mailing list
