Some Qs:

1) Does the configuration only mean: "We have this topology, run all tests that fit into it"? I don't know how Web UI or XML-RPC tests would fit into that. The difference is that a configuration for Web UI tests should mean: "This is how FreeIPA and related components are installed. Test all available functionality and skip what's missing." I.e., if I install a FreeIPA server without a CA, the UI tests need to get that information (e.g. by a no_ca flag) and then skip certificate tests, or the parts of other tests which touch this feature (like navigation tests). The same goes for no-dns, has-trust...
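For illustration, capability-gated skipping could look like the sketch below. Everything here is hypothetical, not existing suite API: the IPA_NO_CA/IPA_NO_DNS/IPA_HAS_TRUST variable names, load_ui_config, and the requires decorator are made-up placeholders for whatever mechanism is chosen.

```python
# Sketch of capability-gated UI tests; all names here are hypothetical.
import os
import unittest


def load_ui_config(environ=os.environ):
    """Derive server capabilities from environment flags (assumed convention)."""
    return {
        'no_ca': environ.get('IPA_NO_CA', '').lower() in ('1', 'true'),
        'no_dns': environ.get('IPA_NO_DNS', '').lower() in ('1', 'true'),
        'has_trust': environ.get('IPA_HAS_TRUST', '').lower() in ('1', 'true'),
    }


def requires(absent_flag):
    """Skip a test when the named 'no_*' flag is set in the configuration."""
    def decorator(test_func):
        def wrapper(self, *args, **kwargs):
            if load_ui_config().get(absent_flag):
                raise unittest.SkipTest(
                    '%s set: feature not installed' % absent_flag)
            return test_func(self, *args, **kwargs)
        return wrapper
    return decorator


class TestCertificates(unittest.TestCase):
    @requires('no_ca')
    def test_cert_page(self):
        pass  # would exercise the certificate UI here
```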

Complete UI testing would be something like the following:
  a) set configuration A
  b) install server with configuration A
  c) run all UI tests (some may be skipped)
          i) in Firefox
         ii) in IE
        iii) in Chrome
  d) uninstall server
  e) set configuration B
  f) install server with configuration B
  g) run all UI tests (some may be skipped)
          i) in Firefox
         ii) in IE
        iii) in Chrome
  h) uninstall server
  i) repeat for config C

Note: the browser is also changed by changing the configuration.

Is it possible? Can the configuration change, or should there be another level of configuration that can change? Do we have time to do it? It may take almost a day (sequentially, with full coverage).
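Steps a)–i) above could be driven by a loop along these lines. This is a sketch only: CONFIGS, BROWSERS, and the commented-out install/uninstall helpers are hypothetical placeholders for the to-be-done command-line tools.

```python
# Driver loop for steps a)-i) above.  CONFIGS, BROWSERS and the commented-out
# install/uninstall helpers are hypothetical placeholders, not existing tools.
CONFIGS = ['config_a', 'config_b', 'config_c']
BROWSERS = ['firefox', 'ie', 'chrome']


def run_ui_suite(config, browser):
    """Placeholder: would invoke the UI tests with this environment set."""
    return {'IPA_UI_CONFIG': config, 'IPA_UI_BROWSER': browser}


def full_ui_pass():
    results = []
    for config in CONFIGS:
        # install_server(config)        # 'to-be-done command-line tool'
        for browser in BROWSERS:
            results.append(run_ui_suite(config, browser))
        # uninstall_server()
    return results
```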

Does Python nose support some kind of master test, or can this be configured in Jenkins, so that we can automate installation of IPA with different configurations (e.g. by the 'to-be-done command-line tools')?

2) There are no configuration options for trusts. IMO we would like to test that.

3) The test runner needs to connect to the remote machines as root. Should a config option for a root password be added, or can we safely assume that there will always be another authentication method available?
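One possible shape for this, sketched under the assumption of a hypothetical $ROOTSSHPW variable (not part of the design below): the runner builds its SSH connection arguments and only adds a password when one is configured, otherwise relying on key-based authentication.

```python
# Sketch only: ROOTSSHPW is a hypothetical variable name, not a design option.
import os


def ssh_connect_kwargs(host, environ=os.environ):
    """Build paramiko-style connect() arguments for one remote host."""
    kwargs = {'hostname': host, 'username': 'root'}
    password = environ.get('ROOTSSHPW')     # hypothetical variable name
    if password:
        kwargs['password'] = password       # fall back to password auth
    # With no password set, key-based authentication is assumed.
    return kwargs
```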


On 06/03/2013 02:05 PM, Petr Viktorin wrote:
A design document for integration testing is available at
http://www.freeipa.org/page/V3/Integration_testing. I've copied it below
for easier quoting.


= Overview =

Make it possible to write and run multi-host integration tests (such as:
install master & replica, add a user on the replica, verify it's added on
the master).

These tests will be run from continuous integration.
Any developer can also run them manually.

= Use Cases =

== Continuous integration ==

The developer team at Red Hat will run a Jenkins continuous integration
server that will run the tests automatically (after each commit, if resources
allow).

The CI results will be posted publicly.

== Developer testing ==

Anyone will be able to run integration tests without advanced infrastructure;
only a number of virtual machines to run the tests on is needed.

== Beaker integration ==

The tests will run seamlessly inside [http://beaker-project.org/
Beaker]/[https://fedoraproject.org/wiki/QA/RHTS RHTS].
A special option enables reporting via BeakerLib.

= Non-goals =

A complete testing/continuous integration setup needs some steps that
will not
be included in IPA's test suite:

* Building the code
* VM provisioning
     There are just too many disparate ways to do it; people
     with a virtual datacenter should already have a preferred tool.
     If we come up with something for ourselves we'll have to make too many
     assumptions for it to be useful somewhere else.
* Configuring the basic system, installing the packages
     Again support for this can be added in the future.
     (Release Puppet/Ansible configuration?)

= Design =

The Python package with the IPA test suite is renamed to
<tt>ipatests</tt>, and
packaged for RPM-based systems as <tt>freeipa-tests</tt>.
Eventually the package will be included in Fedora.

Integration tests will be controlled from a single machine, and executed
on a number of "remote" machines that act as servers, replicas, clients, etc.
The controlling machine communicates with the others via the SSH protocol.
(The controlling machine may be the same as one of the "remote" ones.)

Integration tests are included in the main IPA test suite, and configured via
environment variables. If the variables are missing, all integration tests are
skipped.
If an insufficient number of hosts is configured for a test, the
test will be skipped.
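A sketch of how that host-count check could work, using the $MASTER/$REPLICA/$CLIENT variables described in the configuration section (the suite's actual mechanism may differ; num_replicas is an illustrative attribute name):

```python
# Sketch: skip a test class when not enough hosts are configured.
import os
import unittest


def get_hosts(environ=os.environ):
    """Split the space-separated host lists from the environment."""
    masters = environ.get('MASTER', '').split()
    replicas = environ.get('REPLICA', '').split()
    clients = environ.get('CLIENT', '').split()
    return masters, replicas, clients


class IntegrationTest(unittest.TestCase):
    num_replicas = 1    # how many replicas this particular test needs

    @classmethod
    def setUpClass(cls):
        masters, replicas, clients = get_hosts()
        if not masters:
            raise unittest.SkipTest('integration tests not configured')
        if len(replicas) < cls.num_replicas:
            raise unittest.SkipTest('not enough replicas configured')
```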

A tool is provided to run installed tests.

The remote machines used for integration testing are required to have
IPA packages installed, the firewall opened up, any needed workarounds
applied (RPM downgrades, SELinux mode, ...), and sshd set up to allow root login.
The test runner will connect to these machines, install IPA, perform the tests,
and then uninstall IPA & return the systems to their previous state.

A plugin for integration with BeakerLib is provided.

= Test configuration =

Tests are configured using these environment variables.

== Host configuration ==

; $MASTER
:  FQDN of the first IPA server
; $REPLICA
:  FQDNs of other IPA servers (space-separated)
; $CLIENT
:  FQDNs of IPA clients (space-separated)
; $MASTER_env2, $REPLICA_env2, $CLIENT_env2, $MASTER_env3, ...
:  can be used for additional domains when needed

DNS needs to be set up so that IP addresses can be obtained for these
hostnames.
== Basic configuration ==

:  Directory for test data on the remote hosts
:  Default: /root/ipatests
:  IP of a DNS forwarder
:  Default:

== Test customization ==

:  IPA domain name
:  Default: taken from $MASTER
:  NIS domain name
:  Default: ipatest
:  Set to TRUE for IPv6-only connectivity
:  Set to enable test debugging

:  Admin username
:  Default: admin
:  Admin user password
:  Default: Secret123
:  Directory manager DN
:  Default: cn=Directory Manager
:  Directory manager password
:  Default: Secret123

= Supporting tools =

== ipa-test-config ==

This tool reads the configuration variables above and outputs a Bash script
that sets a much more complete set of variables for easy shell-based testing
or test set-up.

Without arguments, <tt>ipa-test-config</tt> outputs information specific
to the host it is run on. When given a hostname, it prints the config for that
host.
With the <tt>--global</tt> flag, it outputs configuration common to all hosts.
== ipa-run-tests ==

This tool is a wrapper around <tt>nosetests</tt> and accepts the same options
as Nose.
It loads any additional plugins and runs tests from the system-installed
test suite.

== Other ==

TBD: Additional command-line tools may be provided for tasks such as installing
IPA in a given topology.

= Implementation =

Test cases are implemented as Nose test classes, with
installation/uninstallation as class setup/teardown.

A BeakerLib plugin is provided that starts/ends Beaker phases for Nose test
contexts and cases, issues a Beaker assertion (rlPass/rlFail) for each test
case, and collects and submits relevant logs.
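For illustration, the mapping from a test result to a BeakerLib assertion could be as simple as the helper below (a sketch only; the real plugin drives this through Nose's plugin hooks rather than a bare function):

```python
# Sketch: translate one finished test case into a BeakerLib shell call.
import shlex


def beakerlib_command(result, test_name):
    """Build the rlPass/rlFail call reporting one finished test case."""
    quoted = shlex.quote(test_name)
    return ('rlPass %s' if result == 'pass' else 'rlFail %s') % quoted
```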

A separate plugin will be provided to collect logs outside of a Beaker
environment.
= Example instructions =
To run the test called <tt>test_simple_replication</tt>,
which needs to run with two masters, follow these instructions.

Install the IPA server packages on two machines, and do any preparations
necessary to install IPA (e.g. configure/disable the firewall).

Then, install the <tt>freeipa-tests</tt> package on the machine that will run
the tests (this may be one of the machines above, or preferably a different
one).
Set the MASTER and REPLICA environment variables to the fully qualified
domain names of the two machines prepared earlier.
Also set any other relevant variables listed in
[[#Test configuration|Test configuration]].
You may run <tt>ipa-test-config --global</tt> to verify how the test
configuration will be handled.

The next steps depend on whether the test will run within a BeakerLib session
or not.

== With BeakerLib ==

Set up a BeakerLib test (e.g. <tt>rlJournalStart</tt>), and run:

     ipa-run-tests --with-beakerlib --no-skip

The output is somewhat messy, as BeakerLib logs are printed to standard output.
Note that output from external hosts is buffered, so installation may appear
to stall for a while.

Archive any relevant data (e.g. with <tt>rlJournalPrintText</tt>),
and end the BeakerLib session (<tt>rlJournalEnd</tt>).

== Without BeakerLib ==

Run:
     ipa-run-tests test_integration/test_simple_replication.py

As with other Nose tests, no output is shown for test setup (installation)
if nothing goes wrong, so there may be a long time without output.
A summary is printed at the end of the test run.

= Feature Management =

=== UI ===


=== CLI ===

See above

= Major configuration options and enablement =

See instructions above.

= Replication =


= Updates and Upgrades =


(Note: The tests can theoretically be used to drive hosts with other versions
of IPA packages to test backward/forward compatibility.)

= Dependencies =

The freeipa-tests package will depend on some libraries that are already used
for unit tests and other test-related tasks:

* python-nose
* python-paste
* python-coverage
* python-polib

Integration testing brings in a dependency on a library for the SSH protocol:
* python-paramiko

Naturally, the new dependencies are not needed in a production environment.

= External Impact =

Cooperation with QE is underway.

= Design author =

[[User:pviktorin|Petr Viktorin]]

Petr Vobornik

Freeipa-devel mailing list
