You should also read:

http://cwiki.apache.org/confluence/display/qpid/Distributed+Testing

and then try to imagine the results of merging that into the interop test
spec, to come out with a well-written distributed testing spec. Something I
am aiming towards when I can find the time...


On 26/09/2007, Rupert Smith <[EMAIL PROTECTED]> wrote:
>
> Also, we have an internal test instructions page that explains how to run
> all these tests. If Rob approves, we could put it on the Apache Wiki?
>
> On 26/09/2007, Rupert Smith <[EMAIL PROTECTED]> wrote:
> >
> > http://cwiki.apache.org/qpid/interop-testing-specification.html
> >
> > For the interop test automation scheme.
> >
> > On 26/09/2007, Rupert Smith <[EMAIL PROTECTED]> wrote:
> > >
> > > Arnaud,
> > >
> > > There is also a README.txt in the integration tests directory to
> > > explain its purpose.
> > > The difference is:
> > >
> > > sys tests          are for testing the Java client + broker together,
> > > as a single system.
> > > integration tests are for testing the Java client, as an AMQP
> > > component, against any broker or other clients.
> > >
> > > 'sys' and 'integration' may not be the ideal names. However, please
> > > don't suggest renaming them, as it will complicate merges.
> > >
> > > Sys tests are run as part of the Maven build, always using in-vm
> > > brokers.
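> > >
> > > To give a feel for the shape of those tests, here is only a sketch, not
> > > the actual code: InVmBrokerHelper and its methods are placeholders for
> > > whatever the build really uses to bring the in-vm broker up and down:
> > >
> > >     import javax.jms.Connection;
> > >
> > >     import junit.framework.TestCase;
> > >
> > >     public abstract class InVmSysTestCase extends TestCase
> > >     {
> > >         protected Connection connection;
> > >
> > >         protected void setUp() throws Exception
> > >         {
> > >             // Bring the broker up inside this JVM, so the Maven build
> > >             // needs no external processes at all.
> > >             InVmBrokerHelper.start();
> > >             connection = InVmBrokerHelper.createConnectionFactory().createConnection();
> > >             connection.start();
> > >         }
> > >
> > >         protected void tearDown() throws Exception
> > >         {
> > >             connection.close();
> > >             InVmBrokerHelper.stop();
> > >         }
> > >     }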
> > >
> > > Integration tests require the independent starting/stopping of a
> > > broker to run through, as well as possibly starting/stopping test
> > > clients in other languages. They could be automated, but it is just a
> > > bit trickier to do. It was my original intention to automate the whole
> > > interop test cycle between all client languages and brokers in Qpid,
> > > and a scheme for doing so is given in the interop test spec.
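> > >
> > > To automate it, a harness would only need something along these lines
> > > wrapped around the test run (a rough sketch; the broker script path and
> > > the fixed sleep are placeholders, a real harness should watch the broker
> > > output or poll its port instead):
> > >
> > >     import java.io.File;
> > >
> > >     public class ExternalBrokerRunner
> > >     {
> > >         /** Placeholder path; point it at whichever broker launch script is installed. */
> > >         private static final String BROKER_SCRIPT = "bin/qpid-server";
> > >
> > >         private Process broker;
> > >
> > >         public void start() throws Exception
> > >         {
> > >             broker = new ProcessBuilder(BROKER_SCRIPT).directory(new File("."))
> > >                                                       .redirectErrorStream(true)
> > >                                                       .start();
> > >             // Crude: give the broker time to come up before the tests connect.
> > >             Thread.sleep(5000);
> > >         }
> > >
> > >         public void stop()
> > >         {
> > >             if (broker != null)
> > >             {
> > >                 broker.destroy();
> > >             }
> > >         }
> > >     }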
> > >
> > > Client tests are supposed to be pure unit tests for the client code,
> > > but I believe they also test the client against an in-vm broker? As
> > > such, they should not be run against a remote broker.
> > >
> > > Perftests could be run as part of a build too, although ideally not
> > > against an in-vm broker. One of the problems with running perftests to
> > > automatically check performance changes on every build is that the
> > > results of these tests sometimes require 'interpretation'. It would be
> > > nice to do this automatically, for example outputting
> > > latency/throughput graphs to a Wiki page, but this is a whole project
> > > in itself. At the moment, I filter them using grep and open them in a
> > > spreadsheet.
> > >
> > > I have a macro...
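> > >
> > > If anyone wants to script that step instead, the kind of thing I mean
> > > is roughly this (note the "Results:" marker and the field order are
> > > made up; substitute whatever the perftest output actually looks like):
> > >
> > >     import java.io.BufferedReader;
> > >     import java.io.FileReader;
> > >     import java.io.IOException;
> > >
> > >     /** Pulls the result lines out of a perftest log and prints them as CSV. */
> > >     public class PerfLogToCsv
> > >     {
> > >         public static void main(String[] args) throws IOException
> > >         {
> > >             BufferedReader in = new BufferedReader(new FileReader(args[0]));
> > >             System.out.println("test,throughput,latency");
> > >
> > >             String line;
> > >             while ((line = in.readLine()) != null)
> > >             {
> > >                 if (line.startsWith("Results:"))
> > >                 {
> > >                     // e.g. "Results: PingTest 1234.5 6.7" -> "PingTest,1234.5,6.7"
> > >                     String[] fields = line.substring("Results:".length()).trim().split("\\s+");
> > >                     System.out.println(fields[0] + "," + fields[1] + "," + fields[2]);
> > >                 }
> > >             }
> > >
> > >             in.close();
> > >         }
> > >     }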
> > >
> > > An interesting test for you to look at might be ImmediateMessageTest.
> > > This one can be run in-vm, against a remote broker, or even distributed
> > > across many test nodes, all running the exact same test case. This is
> > > currently where I am going with the tests, also with a view to being
> > > able to run large pub/sub tests, and adding *lots* more interop tests,
> > > all with a common framework.
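> > >
> > > The trick that makes this possible is simply that the test never
> > > hard-codes where its broker is. A minimal sketch of the idea (the
> > > "broker" property name and the vm://:1 default are illustrative, not
> > > necessarily what the real test uses):
> > >
> > >     import javax.jms.Connection;
> > >     import javax.jms.ConnectionFactory;
> > >
> > >     import junit.framework.TestCase;
> > >
> > >     public abstract class SwitchableBrokerTestCase extends TestCase
> > >     {
> > >         /** Connects to the in-vm broker by default, or to wherever -Dbroker=... points. */
> > >         protected Connection connect() throws Exception
> > >         {
> > >             String broker = System.getProperty("broker", "vm://:1");
> > >             ConnectionFactory factory = createConnectionFactory(broker);
> > >             Connection connection = factory.createConnection();
> > >             connection.start();
> > >
> > >             return connection;
> > >         }
> > >
> > >         /** Supplied by a Qpid specific subclass, so the test itself stays pure JMS. */
> > >         protected abstract ConnectionFactory createConnectionFactory(String broker) throws Exception;
> > >     }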
> > >
> > > A situation I am very keen to avoid is divergence of the test code
> > > between different branches. The tests should be the same across all, to
> > > show that all work in the same way. It's the only sensible way I can
> > > think of to ensure that when we eventually move from 0.8 to 0.10 we
> > > carry across the behaviour from the old to the new.
> > >
> > > These tests should work at the surface of the product, that is,
> > > through the JMS or Qpid APIs in the respective languages. In the Java
> > > case at least, this should be easy because of JMS, and there should be
> > > a sub-class to do Qpid/AMQP specific stuff (perhaps two: one for M2/0.8
> > > and one for the trunk/0.10 new client).
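> > >
> > > To make the split concrete, the sort of shape I have in mind is below
> > > (again only a sketch, building on the SwitchableBrokerTestCase idea
> > > above; the queue name and the sendImmediate hook are made up):
> > >
> > >     import javax.jms.Message;
> > >     import javax.jms.MessageProducer;
> > >     import javax.jms.Session;
> > >
> > >     public abstract class ImmediateFlagTestBase extends SwitchableBrokerTestCase
> > >     {
> > >         public void testSendWithImmediateFlag() throws Exception
> > >         {
> > >             Session session = connect().createSession(false, Session.AUTO_ACKNOWLEDGE);
> > >             MessageProducer producer = session.createProducer(session.createQueue("test.queue"));
> > >             Message message = session.createTextMessage("ping");
> > >
> > >             // Plain JMS has no 'immediate' flag, so this is delegated to the
> > >             // Qpid/AMQP specific subclass (one for M2/0.8, one for the
> > >             // trunk/0.10 client).
> > >             sendImmediate(producer, message);
> > >         }
> > >
> > >         protected abstract void sendImmediate(MessageProducer producer, Message message) throws Exception;
> > >     }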
> > >
> > > Perhaps we could pull some of the test code (integration + perftests +
> > > Immediate/MandatoryMessageTest) out of the current M2/M2.1/trunk
> > > branchfest into a separate top-level project? That is something that I
> > > would like to do. Arnaud, you have already ported perftests on trunk to
> > > work through pure JMS, so that makes it possible to do this. Thoughts?
> > >
> > > Rupert
> > >
> > > On 26/09/2007, Arnaud Simon < [EMAIL PROTECTED]> wrote:
> > > >
> > > > Hi,
> > > >
> > > > I would like to know more about our testing strategy. So, the unit
> > > > tests of the broker and client modules are run on a regular basis, as
> > > > they are part of the Maven build process. We will need to update the
> > > > client module tests so that we can configure them to run against a
> > > > remote broker. So far so good.
> > > > There are also three other test modules:
> > > > - systests
> > > > - integrationtests
> > > > - perftests
> > > > (Note: the integrationtests module depends on the systests module)
> > > > While it is clear to me what perftests are about, it is less clear
> > > > what the difference is between the systests and integrationtests
> > > > modules. Can somebody explain it to me? Moreover, those tests are not
> > > > run as part of the Maven build, so my question is: when are they run?
> > > > Shouldn't we run them as part of the Maven build?
> > > >
> > > > Regarding the perftests, I really think that we should run them (not
> > > > all of them, but some) as part of the standard build. This could help
> > > > us detect whether a change has impacted performance.
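> > > >
> > > > For example, a rough idea of the kind of smoke test I mean (the
> > > > message count, queue name and the 1000 msg/s floor below are purely
> > > > illustrative, and the session would come from whatever broker set-up
> > > > the build already has):
> > > >
> > > >     import javax.jms.MessageProducer;
> > > >     import javax.jms.Session;
> > > >
> > > >     import junit.framework.TestCase;
> > > >
> > > >     public abstract class ThroughputSmokeTest extends TestCase
> > > >     {
> > > >         /** Supplied by the existing test infrastructure. */
> > > >         protected abstract Session createSession() throws Exception;
> > > >
> > > >         public void testThroughputFloor() throws Exception
> > > >         {
> > > >             Session session = createSession();
> > > >             MessageProducer producer = session.createProducer(session.createQueue("perf.smoke"));
> > > >
> > > >             int messages = 10000;
> > > >             long start = System.currentTimeMillis();
> > > >             for (int i = 0; i < messages; i++)
> > > >             {
> > > >                 producer.send(session.createTextMessage("payload"));
> > > >             }
> > > >             long elapsed = System.currentTimeMillis() - start;
> > > >
> > > >             double msgsPerSecond = messages / (elapsed / 1000.0);
> > > >             assertTrue("Throughput fell below the agreed floor: " + msgsPerSecond,
> > > >                        msgsPerSecond > 1000.0);
> > > >         }
> > > >     }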
> > > >
> > > > More generally, our testing strategy should be discussed during our
> > > > f2f. But until then, I would suggest that we convert the client
> > > > module tests to run against a remote broker, and maybe run some
> > > > perftests with the build.
> > > >
> > > > Arnaud
> > > >
> > > >
> > >
> >
>
