> We should have a set of functionality and performance
> tests before we get into (re)architecture, esp. if the
> changes are meant to improve performance.

I agree that we should have tests for performance and adherence to
specifications, but what are you proposing with respect to getting James 2.1
released?  Are you proposing that all work on James cease until there is a
new test suite authored, or just that we prepare unit tests at the beginning
of work for James v3 so that we can do regression and performance testing as
changes in code and architecture occur?

Personally, I agree with the latter, but I have some concern about the
former.  The current code is demonstrably better than the current release,
and the plan was to get out a stable replacement for 2.0a3, fork it as a
maintainable branch, and then look at future work.  The scheduler problem,
and the benefit of switching to the watchdog code, is demonstrable.  Do you
read the James User list as well as the James Developer list?

> Checking in JUnit based test suite for POP3 protocol in a short while.
> Planning to add SMTP and NNTP too. IMAP would be an excellent addition.

You might also want to take a look at Russell Coker's POSTAL performance
tests (http://www.coker.com.au/postal/).  External tests don't need to be
written in Java to be useful.

> Currently it exercises POP3 Protocol and collects performance stats but it
> does not check for correctness.

There are plenty of available performance metrics, so IMO we need
correctness testing more than we need performance testing.
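For a flavor of what a correctness test looks like, here is a minimal
sketch (the class and method names are mine, not anything in the James
tree) that checks the one thing RFC 1939 guarantees about every POP3
response: it begins with a "+OK" or "-ERR" status indicator.

```java
// Sketch only: class and method names are illustrative, not James code.
public class Pop3ResponseCheck {

    // RFC 1939 defines exactly two status indicators: "+OK" and "-ERR".
    // A correctness test asserts on conformance like this, not on timing.
    static boolean isValidStatus(String line) {
        return line != null
            && (line.startsWith("+OK") || line.startsWith("-ERR"));
    }
}
```

A JUnit suite would wrap assertions like this around responses read from
a live socket, alongside (not instead of) the performance stats.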

> Would be good to get our testing strategy to a better level.

Is this something you have the time to do, and take responsibility for?  Do
you have time to take a lead role and work with other volunteers to
coordinate a test plan and suite?

[Nathan]
> > using a profiling tool to see how much time is actually being
> > wasted by configuring each Handler instead of more tightly
> > coupling them to the server listener.

This one is pretty much a priori.  The plain fact is that a handful of
simple setX methods is going to be faster than even just the string lookup
part of pulling named attributes from a DOM.  Whether or not the performance
savings is significant is a different matter.  My guess is more than likely
not, because the operation of the handler will be relatively large compared
to its initialization, although there are also savings related to transient
objects and garbage collection.  HOWEVER, there is more to architecture than
JUST performance.  :-)
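To make the comparison concrete, here is a rough sketch of the two
configuration styles.  The Handler class and attribute names are invented
for illustration, and a Map stands in for the DOM attribute lookup:

```java
import java.util.Map;

// Illustrative only: the Handler class and attribute names are invented;
// a Map stands in here for the DOM attribute lookup.
public class HandlerConfigDemo {

    static class Handler {
        int timeout;
        String helloName;
        void setTimeout(int t) { timeout = t; }
        void setHelloName(String n) { helloName = n; }
    }

    // Attribute style: per-handler string lookups plus parsing, which
    // also churns transient objects for the garbage collector.
    static Handler fromAttributes(Map<String, String> attrs) {
        Handler h = new Handler();
        h.setTimeout(Integer.parseInt(attrs.get("timeout")));
        h.setHelloName(attrs.get("helloName"));
        return h;
    }

    // Setter style: the listener parses the configuration once, and
    // each handler is configured with a few plain field assignments.
    static Handler fromSetters(int timeout, String helloName) {
        Handler h = new Handler();
        h.setTimeout(timeout);
        h.setHelloName(helloName);
        return h;
    }
}
```

Either way the handler ends up identically configured; the difference is
per-handler lookup and parsing cost, which profiling would have to show is
significant before it justifies an architecture decision on its own.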

[Nathan]
> > JMeter doesn't seem to support email server load testing
> > - is there anything better than this?

See Russell Coker's postal (above).

[Nathan]
> > Again, I'd be more than happy to help in testing because this
> > significantly affects what I'm doing in IMAP, and I'm going to have to
> > perform this kind of testing on IMAP due to its complexity and higher
> > overhead.

Again, more important than performance is adherence to the RFCs.  If, as is
appropriate, there is going to be effort put into testing, I think that the
first thing for folks to do is learn the RFCs (or at least ONE of them) and
write test specs to ensure that the code is CORRECT.
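As a concrete example of a test spec derived straight from an RFC:
RFC 821/2821 pins down the shape of every SMTP reply as a three-digit
code whose first digit (1-5) classifies the outcome.  A sketch (names
are mine):

```java
// Sketch only: a conformance predicate derived straight from the RFC.
public class SmtpReplyCheck {

    // RFC 821/2821: every SMTP reply begins with a three-digit code,
    // and the first digit (1-5) classifies the outcome.
    static boolean isValidReplyCode(String line) {
        if (line == null || line.length() < 3) {
            return false;
        }
        for (int i = 0; i < 3; i++) {
            if (!Character.isDigit(line.charAt(i))) {
                return false;
            }
        }
        char first = line.charAt(0);
        return first >= '1' && first <= '5';
    }
}
```

Each MUST/MUST NOT in the RFC turns into a predicate like this, and the
test suite drives the server and asserts on them.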

> I was initially thinking of using jython and writing JUnit tests with it.

Has anyone integrated JUnit with the Avalon lifecycle?  Remember that these
things run in a container context that is not obvious to JUnit.  I've
searched Google for references, but no luck.
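The shape of such an integration would presumably be a setUp()/tearDown()
pair that drives the lifecycle by hand.  Here's a sketch of the idea,
self-contained with stand-in interfaces (the real ones are Initializable
and Disposable in org.apache.avalon.framework.activity), and a
hypothetical component:

```java
// Stand-ins so the sketch is self-contained; the real interfaces are
// Initializable and Disposable in org.apache.avalon.framework.activity.
interface Initializable { void initialize() throws Exception; }
interface Disposable { void dispose(); }

// Hypothetical component under test.
class SampleComponent implements Initializable, Disposable {
    boolean ready;
    public void initialize() { ready = true; }
    public void dispose()    { ready = false; }
}

// A JUnit TestCase would do this in setUp()/tearDown(), so each test
// sees the component in the state the container would provide.
public class LifecycleHarness {
    static SampleComponent setUp() throws Exception {
        SampleComponent c = new SampleComponent();
        c.initialize();   // what the container does before service
        return c;
    }
    static void tearDown(SampleComponent c) {
        c.dispose();      // what the container does on shutdown
    }
}
```

The hard part in practice is supplying a Configuration and Logger the way
the container does, which is exactly the part that isn't obvious to JUnit.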

> Velocity was the preferred template language choice.

Must have been before my time.  I dislike Velocity intensely.  There are any
number of other choices.

By the way, Harmeet, if you have some spare time on your hands, a bridge for
writing matchers and mailets in jython (or any BSF supported scripting
language) would be slick.  :-)
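The shape of that bridge would be one generic mailet whose service()
delegates to a script.  Everything in this sketch is hypothetical: the
stub interface stands in where BSF's BSFManager would go, and Mail is a
placeholder for the Mailet API's mail object.

```java
// Entirely hypothetical sketch of the bridge idea.  The stub interface
// stands in where BSF's BSFManager would go, and Mail is a placeholder
// for the Mailet API's mail object.
interface ScriptEngineStub {
    void eval(String script, Mail mail);
}

class Mail {
    String state = "unprocessed";
}

// One generic mailet carries all the Java-side plumbing; each concrete
// matcher or mailet is then just a jython (or other BSF) script.
public class ScriptedMailet {
    private final ScriptEngineStub engine;
    private final String script;

    ScriptedMailet(ScriptEngineStub engine, String script) {
        this.engine = engine;
        this.script = script;
    }

    void service(Mail mail) {
        engine.eval(script, mail);  // the script mutates the mail
    }
}
```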

        --- Noel


--
To unsubscribe, e-mail:   <mailto:[EMAIL PROTECTED]>
For additional commands, e-mail: <mailto:[EMAIL PROTECTED]>
