Myrna van Lunteren wrote:
On 2/2/06, Andreas Korneliussen <[EMAIL PROTECTED]> wrote:
I think the work currently done on DERBY-874 was mainly to improve the
DerbyJUnitTest's JavaDoc, and to log exceptions. So I would not throw
that away.
However, I do propose changing DerbyJUnitTest by moving everything related to configuration out into a separate class.
cool. thx for the reply.
I now noticed that the wiki says all suggestions are to be put on the
list, so here I go rather than plopping them directly on the wiki:
Great feature list.
What I think is important is that we provide a common mechanism for getting access to the configuration of the test.
If someone then needs to do some framework- or version-specific logic, they have a mechanism to get, for instance, the framework enum, and can write logic based on that.
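To make that concrete, here is a rough sketch of what such a separated-out configuration class could look like. The class and method names here are made up for illustration, not the actual DERBY-874 design:

    // Hypothetical configuration holder, separated out of DerbyJUnitTest.
    public class TestConfiguration {

        // framework "enum" in pre-Java-5 style
        public static final int EMBEDDED = 0;
        public static final int DERBY_NET_CLIENT = 1;

        private final int framework;
        private final String serverHost;
        private final int serverPort;
        private final String jvmVendor = System.getProperty("java.vendor");
        private final String jvmVersion = System.getProperty("java.version");

        public TestConfiguration(int framework, String serverHost, int serverPort) {
            this.framework = framework;
            this.serverHost = serverHost;
            this.serverPort = serverPort;
        }

        public int getFramework()     { return framework; }
        public String getServerHost() { return serverHost; }
        public int getServerPort()    { return serverPort; }
        public String getJvmVendor()  { return jvmVendor; }
        public String getJvmVersion() { return jvmVersion; }
    }

A test that needs framework- or jvm-specific logic would then ask the configuration object (e.g. getFramework()) and branch on that, instead of parsing system properties itself.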
I think the following could qualify as 'more details' on the jvm-, framework-, and version-specific logic:
1. jvm-specific:
1.1. Not all parameters are consistent across all jvms. Think here of jit settings/configurations and memory settings. For J2ME testing, that jvm doesn't come with a DriverManager implementation, so from the start you know you have to go with DataSources.
So I guess what you are saying is that if the test framework provides a common mechanism to hand out a Connection to a Derby database, it should go through a DataSource instead of using DriverManager? (See the DataSource sketch after item 1.3 below.)
1.2. Different versions of a vendor's jvm may also have slightly different implementations, resulting in slightly different behavior - for instance in the order of rows, or in the rounding of certain numeric values.
1.3. Some behavior will only be supported by later versions...
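Regarding the DriverManager/DataSource question above: a minimal sketch of what a DataSource-based connection helper could look like. The helper class is made up for illustration; EmbeddedDataSource is Derby's embedded javax.sql.DataSource implementation:

    import java.sql.Connection;
    import java.sql.SQLException;
    import org.apache.derby.jdbc.EmbeddedDataSource;

    // Hypothetical helper: hand out Connections through a DataSource so the
    // same code path also works on jvms that have no DriverManager.
    public class ConnectionFactory {
        public static Connection openEmbedded(String dbName) throws SQLException {
            EmbeddedDataSource ds = new EmbeddedDataSource();
            ds.setDatabaseName(dbName);
            ds.setCreateDatabase("create"); // create the database if it does not exist
            return ds.getConnection();
        }
    }

For J2ME/JSR-169 the harness would presumably instantiate org.apache.derby.jdbc.EmbeddedSimpleDataSource here instead, while the calling test code stays the same.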
2. version specific.
This really falls back to the discussion on ...(can't find it right now, Raman's working on it, I think)... re mixed-version testing. I think the conclusion was that the harness needs a way to cope with results from newer servers and clients - if they differ from the results with the same versions as the harness.
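One way a test could branch on this, assuming it gets its Connection through the common configuration mechanism (the helper name is made up):

    import java.sql.Connection;
    import java.sql.DatabaseMetaData;
    import java.sql.SQLException;

    public class VersionCheck {
        // Returns true when the server we are connected to is at least the
        // given version, so a test can accept either the old or the new
        // expected result in a mixed-version run.
        public static boolean serverAtLeast(Connection conn, int major, int minor)
                throws SQLException {
            DatabaseMetaData dmd = conn.getMetaData();
            int dbMajor = dmd.getDatabaseMajorVersion();
            int dbMinor = dmd.getDatabaseMinorVersion();
            return dbMajor > major || (dbMajor == major && dbMinor >= minor);
        }
    }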
3. framework specific
The tests need to be able to cope with the following:
3.1. different client drivers (e.g. DerbyClient, IBM Universal JDBC Driver)
3.2. the server may need to be started by the harness, or not
3.3. the server may be on localhost, or be running on a remote machine. Certain individual tests may not be able to run with this mechanism...
3.4. it should be possible to have the harness start the server in a different jvm.
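For 3.2/3.3, a rough sketch of how a harness could decide whether to start the network server in-process; the helper and its parameters are just illustrative:

    import java.io.PrintWriter;
    import java.net.InetAddress;
    import org.apache.derby.drda.NetworkServerControl;

    public class ServerHelper {
        // Start the Derby network server in this jvm, but only when the
        // configuration says the harness is responsible for it (3.2) and the
        // server is supposed to run locally (3.3).
        public static NetworkServerControl startIfNeeded(boolean startServer,
                                                         String host, int port)
                throws Exception {
            if (!startServer || !"localhost".equals(host)) {
                return null; // server is external or remote; nothing to do
            }
            NetworkServerControl server =
                    new NetworkServerControl(InetAddress.getByName(host), port);
            server.start(new PrintWriter(System.out, true));
            return server; // caller can later call server.shutdown()
        }
    }

For 3.4 the harness would instead have to spawn a separate process, e.g. run 'java org.apache.derby.drda.NetworkServerControl start -p <port>' via Runtime.exec.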
4. One thing the current harness has no way of doing is coping with different OSs. For instance, sometimes there are slight differences in the behaviour of the same jvm version on different OSs, like slightly different error messages (although this may be irrelevant if we're not gathering & comparing output).
I think the following details would be useful (in addition to the above
and item 1 on the wiki):
- there must be a way to skip individual tests for certain configurations without causing an error, but with an informational message - e.g. absence of optional jars (specifically thinking of db2jcc.jar), unsupported functionality with older jvms..., or when there is a problem that's being worked on, or that's been referred to some other organization (e.g. in the case of jvm bugs, OS bugs...). A sketch of one way to do this in JUnit follows after this list.
- some way to compare runtimestatistics.
Currently this is done by comparing the output; I have a hard time thinking of another mechanism.
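On the 'skip without error' point above: in JUnit 3.x one way to express a conditional skip is to decide in the static suite() method whether to add any tests at all. A sketch, with the class name and message made up:

    import junit.framework.Test;
    import junit.framework.TestCase;
    import junit.framework.TestSuite;

    public class JccDependentTest extends TestCase {

        public static Test suite() {
            TestSuite suite = new TestSuite("JccDependentTest");
            if (!classAvailable("com.ibm.db2.jcc.DB2Driver")) {
                // db2jcc.jar is not on the classpath: report it, but return
                // an empty suite instead of failing the run.
                System.out.println("Skipping JccDependentTest: db2jcc.jar not available");
                return suite;
            }
            suite.addTestSuite(JccDependentTest.class);
            return suite;
        }

        private static boolean classAvailable(String className) {
            try {
                Class.forName(className);
                return true;
            } catch (ClassNotFoundException e) {
                return false;
            }
        }

        public void testSomethingThatNeedsJcc() {
            // ... real test code would go here ...
        }
    }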
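On the runtimestatistics point: assuming what gets compared today is the text Derby returns from SYSCS_UTIL.SYSCS_GET_RUNTIMESTATISTICS(), a sketch of how a test could fetch that text for a single query (the helper name is made up):

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class RuntimeStatisticsHelper {
        // Run a query with runtime statistics enabled and return the
        // statistics text Derby produces for it.
        public static String statisticsFor(Connection conn, String query)
                throws SQLException {
            Statement s = conn.createStatement();
            s.execute("CALL SYSCS_UTIL.SYSCS_SET_RUNTIMESTATISTICS(1)");

            ResultSet rs = s.executeQuery(query);
            while (rs.next()) { /* drain the rows; only the statistics matter here */ }
            rs.close();

            rs = s.executeQuery("VALUES SYSCS_UTIL.SYSCS_GET_RUNTIMESTATISTICS()");
            rs.next();
            String stats = rs.getString(1);
            rs.close();
            s.close();
            return stats;
        }
    }

A test could then assert on specific fragments of that text (e.g. which index was used) instead of diffing complete output files.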
I am not sure which runtimestatistics you are thinking of. Which output?
Output from a SQL statement?
Thanks
--Andreas