On Feb 9, 2006, at 9:10 AM, Prasad Kashyap wrote:

I have completed items 1 and 3 of the test strategy outlined above.
With that we now have the geronimo-deployment-plugin in maven 2.

What are items 1 and 3?  Don't see an outline.

-David


I have now begun working on item 2, which is to create a separate
itests sub-project that will act as a catchment for the itests from the
various modules. Now this is what we should look into - have the
itests subproject as a dependency of the Geronimo project, and include
the tests during the integration-test phase of the lifecycle in G's build.
Yet keep it separate, and thus be able to run the itests against any
G distribution binary separately. This would be the best of both
worlds. I'll investigate the possibility of doing so.
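As a rough sketch of what that separation might look like (the module and artifact names below are hypothetical, not actual Geronimo coordinates), the itests subproject could be a sibling m2 module that depends only on the published server artifacts, so it can be pointed at any distribution binary:

```xml
<!-- itests/pom.xml - hypothetical sketch, not the actual Geronimo build -->
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>org.apache.geronimo</groupId>
  <artifactId>geronimo-itests</artifactId>
  <version>1.0-SNAPSHOT</version>
  <dependencies>
    <!-- depend on the assembly under test, not on its source modules -->
    <dependency>
      <groupId>org.apache.geronimo</groupId>
      <artifactId>geronimo-assembly</artifactId>
      <version>1.0-SNAPSHOT</version>
    </dependency>
  </dependencies>
</project>
```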

Jason, it is at this point that I have begun looking at TestNG. I have
some questions for you.
- Have you used it yourself?
- How do you think it will fit into the picture I painted above, where
the testsuites will be contributed by folks from the various
modules?
- Do they all have to use TestNG then?

We need a framework that can perform system-level tests, because that is
what itests are at this point.

Cheers
Prasad

On 1/31/06, Jason Dillon <[EMAIL PROTECTED]> wrote:
Anyone thought about using TestNG?

Its xml suite def and grouping support would be nice to define these itest suites.
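For reference, a TestNG suite definition along those lines might look like this (the suite, group, and class names here are invented for illustration, not anything that exists in the Geronimo tree):

```xml
<!-- testng.xml - illustrative sketch only; all names are made up -->
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd">
<suite name="GeronimoItests">
  <test name="System">
    <groups>
      <run>
        <include name="deployment"/>
        <include name="naming"/>
        <exclude name="slow"/>
      </run>
    </groups>
    <classes>
      <class name="org.apache.geronimo.itests.DeploymentTest"/>
      <class name="org.apache.geronimo.itests.NamingTest"/>
    </classes>
  </test>
</suite>
```

The grouping support is what makes the subset-of-suites idea cheap: the same test classes can be sliced into different runs just by editing the include/exclude lists.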

--jason


-----Original Message-----
From: David Blevins <[EMAIL PROTECTED]>
Date: Mon, 30 Jan 2006 20:41:39
To:[email protected]
Subject: Re: Test strategy


On Jan 30, 2006, at 7:43 PM, Prasad Kashyap wrote:
I would like to solicit the views and opinions of others like you as well.

Great.  I poked some other people in irc to get involved as well.
Here is hoping.... :)  More input is going to make for a better
community supported solution.

- The tests can all be run or only a subset of suites can be run
quite easily.

When you say "easily" do you mean by virtue of them being separate or
some other idea?

When I said we could run a subset of tests "easily" when the itests
are run "on" assemblies, I meant that it would be relatively easy to
selectively include/exclude tests (by category), compared to doing
the same if the itests are embedded with their modules.

I think I follow what you mean.  I guess the haze is in what you mean
by "embedded."

As I'm not sure, I will just state in more specific terms that I have
never argued for putting the actual testing source code inside any
particular assembly module -- this code is big, bulky, has a lot of
deps itself and some pretty specific packaging needs.  I have argued
for simply running the big test suite during the lifecycle of an
assembly.

Not sure if that is coming across well, or if it's being assumed that
since I've recommended running the integration tests on an assembly,
during the integration-test phase of its lifecycle, that the test
source of course must be in the assembly as well.  I believe we agree
on keeping the test source in modules separate and organized.

- The tests will be grouped logically into suites or categories. Each
individual test will fall in one of these suites.

Not sure of your usage of the word "test" in this sentence. In JUnit
terms, is that a TestSuite, a TestCase, or an individual test method?
Can you clarify?

I meant TestCase. For example, we could have a TestSuite called "System"
which will have test cases from many different modules. We could say
that these form the core set of tests. If the System suite passes, we
have a somewhat stable binary, with the caveat that not all functions
may work.

That is much clearer to me, thanks.  I do like the sort of parent-
module name you came up with, "System", too -- it's a keeper.

At this point, I will note that it is possible to put a single
SuperServerSystemTest.java TestSuite in the SuperServer assemblies
module (for example) that adds only the tests from the many different
test modules that apply to that assembly.
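A minimal sketch of such an aggregating suite in JUnit 3 style (the class and package names are hypothetical, so this is illustrative rather than compilable against the real tree):

```java
// SuperServerSystemTest.java - hypothetical sketch of an assembly-specific suite
import junit.framework.Test;
import junit.framework.TestSuite;

public class SuperServerSystemTest {
    public static Test suite() {
        TestSuite suite = new TestSuite("System");
        // Pull in only the TestCases from the test modules
        // that apply to this particular assembly.
        suite.addTestSuite(org.apache.geronimo.itests.DeploymentTest.class);
        suite.addTestSuite(org.apache.geronimo.itests.NamingTest.class);
        return suite;
    }
}
```

The assembly module would then carry only this one small suite class as a dependency glue point, while the bulky test code itself stays in the separate test modules.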

In summary, the con of running the itests "on" assemblies is that it
goes against the m2 lifecycle. And even though integration-test is an
m2 lifecycle phase, we are not exploiting its usage but calling it
explicitly again. The same has to be done for the deploy phase too.
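For contrast, exploiting the phase directly would mean binding the test plugin to integration-test in the pom rather than invoking it by hand -- something along these lines (a hedged sketch; the exact surefire configuration available in m2 at this point may differ):

```xml
<!-- hypothetical sketch: bind an extra surefire run to the integration-test phase -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <executions>
    <execution>
      <id>itests</id>
      <phase>integration-test</phase>
      <goals>
        <goal>test</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```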

You captured that quite well.

It occurred to me while writing a sentence above that there is no
clear distinction between "on" and "during".  In all cases you are
quite literally running the tests on the assembly.  The better
distinction is whether or not you wish to run them "during" its
lifecycle or "after" in the lifecycle of a separate module.

-David


