On 28 Mar 2007, at 0:19, David Van Couvering wrote:

I think everything is covered, except for the question of testing. I recognize you just wanted to get something started and some feedback. But before anything gets checked in, I would need to see some validation that it works, even partially works.

That's for sure, I was not hoping for 'inclusion' or anything. I just wanted some feedback in order to avoid losing tons of time on things that are obvious to you derbyers and absolutely not obvious to me (which is exactly what you have done, BTW). Once I have what I need, I'll prepare a nicer package along with test cases for submission.

Thinking about it further, rather than building new tests, I guess all you need to exercise this code is to modify the modules.properties file to use this storage factory, and then run some subset of the Derby unit tests.

Good to know, so I need to tweak modules.properties to enable my storage instead of the default ones.
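
For reference, here is a rough sketch of what I think the registration could look like, if I understand the sub-sub-protocol mechanism correctly. I have not verified this against modules.properties yet; the 'jnlp' name and the factory class below are just placeholders of mine, and I believe the same mapping could also be set as a plain system property instead of editing modules.properties:

    // Placeholder names throughout: "jnlp" as the sub-sub-protocol and the
    // factory class are mine; whether a plain system property behaves the
    // same as a modules.properties entry still needs to be verified.
    System.setProperty("derby.subSubProtocol.jnlp",
                       "org.example.jnlp.JNLPPersistenceStorageFactory");

    // The storage would then be selected through the JDBC URL, e.g.:
    //   jdbc:derby:jnlp:MyTestDb;create=true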

I don't know the storage area very well, but I did find the following:

java/testing/org/apache/derbyTesting/unitTests/store

I noticed them as well, but haven't found the time to try them yet. I will soon.

I have never run these tests, but if you ask on the list with a new subject header you should get some help, as well as any help you may need on configuring Derby to use your storage factory. What is unclear to me is how to automate running in a Java Web Start environment vs. in a standard no-sandbox VM. Do you know how to do this?

This is another of my problems. I've asked about this on the Sun JNLP forums and on the JUnit mailing list (you can read about it here: http://tech.groups.yahoo.com/group/junit/message/19132), and it seems there is _NO_ way to automate tests against the JNLP APIs.

This is because even though the JNLP interfaces are available to every JVM via jnlp.jar, the implementation classes don't get loaded unless you run your program through the javaws executable, which requires a JNLP descriptor file (which in turn requires you to put the code in a proper jar archive on a working webserver, etc.).
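
To make it concrete, this is the kind of lookup my storage factory has to do; outside of javaws the call below simply throws UnavailableServiceException because no service implementation has been installed (the snippet is mine, just to illustrate the problem):

    import javax.jnlp.PersistenceService;
    import javax.jnlp.ServiceManager;
    import javax.jnlp.UnavailableServiceException;

    // Works when launched through javaws, fails in a plain JVM/IDE:
    try {
        PersistenceService ps = (PersistenceService)
            ServiceManager.lookup("javax.jnlp.PersistenceService");
        // ... use ps to store the database files ...
    } catch (UnavailableServiceException use) {
        // This is what happens in a plain JVM: the interface is on the
        // classpath (jnlp.jar) but no implementation has been registered.
    }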

This means that each time you change even a single line of code you have to rebuild the whole package and test it outside the IDE by launching the javaws executable. I have YET to find a way to automate tests or debug JNLP-API applications: even on the JUnit mailing list no luck, they just suggested stubbing out the PersistenceService API, which is possible but would 'defeat' most of the testing, since I also want to test my implementation against the real PersistenceService and not just in theory :)
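
For completeness, here is roughly what the suggested stubbing would look like (my own sketch, not part of the patch): javax.jnlp.ServiceManager lets you install a ServiceManagerStub, so a test could hand back a fake PersistenceService. Useful for exercising my code paths, but it still says nothing about the real JWS implementation. FakePersistenceService is a hypothetical test-only class:

    import javax.jnlp.PersistenceService;
    import javax.jnlp.ServiceManager;
    import javax.jnlp.ServiceManagerStub;
    import javax.jnlp.UnavailableServiceException;

    // Fragment meant for a JUnit setUp(); FakePersistenceService would be
    // a test-only, in-memory implementation of PersistenceService.
    final PersistenceService fake = new FakePersistenceService();

    ServiceManager.setServiceManagerStub(new ServiceManagerStub() {
        public Object lookup(String name) throws UnavailableServiceException {
            if ("javax.jnlp.PersistenceService".equals(name)) {
                return fake;
            }
            throw new UnavailableServiceException(name);
        }
        public String[] getServiceNames() {
            return new String[] { "javax.jnlp.PersistenceService" };
        }
    });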

I've even tried a dirty trick: manually including javaws.jar, which contains the com.sun.jnlp.* implementation classes, and manually initializing and registering the services, but it all fails horribly because of many missing bits (such as system properties that need to be set and the like... all of this probably gets done by the javaws executable itself, or by some other class I have yet to find).

Maybe by digging into the code long enough one could find a way to properly 'force' the implementation classes to be loaded manually, but I haven't had the time to try this for real.

I also want to see at least some subset of our higher-level system tests run under the JNLP storage factory configuration. Most of the time the database created is quite small, so I don't think it should be an issue. In particular, I'd be curious to see what security bugs/issues we hit as Derby tries to do various things with the system under a SecurityManager as constrained as the JWS one.

I'm curious about this as well; let's hope Derby doesn't go too far and is JWS-sandbox-friendly :) If it isn't, that will be my next step: to make it so! :P
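
Just to give an idea of the kind of restrictions involved, here is a tiny probe of my own (nothing Derby-specific; the property name is only an example) of the operations an unsigned JWS sandbox typically refuses:

    // Typical operations Derby needs at boot; under the unsigned JWS
    // sandbox these are expected to throw SecurityException.
    try {
        System.getProperty("derby.system.home");           // read a system property
        new java.io.FileOutputStream("derby.log").close(); // write a local file
    } catch (SecurityException se) {
        System.err.println("blocked by the sandbox: " + se);
    } catch (java.io.IOException ioe) {
        System.err.println("i/o problem: " + ioe);
    }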

In terms of running derbyall, a general principle is that no checkin takes place without running derbyall. However, I do recognize that in your case this is not necessarily applicable, since your code will never be hit by running the existing regression suites.

I will try to run as many tests as possible, but I have yet to find an easy way of doing this. Testing JNLP is only possible outside an IDE, in a fully deployed JNLP application package.

What I would like to see is that once you have identified new tests, or new configurations of old tests, that exercise your storage service, these new tests/configurations get added to derbyall. I don't think this is required for initial checkin, though.

If I get the time, I would also love to provide a full test suite that exercises the whole StorageFactory/StorageFile interfaces in an implementation-agnostic way. That would give us an easy way to fully test each storage implementation (old and new) just by running the suite, to check whether the storage is fully specification- and API-compliant.

Currently I don't have enough Derby storage knowledge to implement such an extensive test suite, and it's not my main focus. Maybe later, once JNLP is done :)
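
Just to record the idea, here is a very rough shape of what I mean. The StorageFactory/StorageFile method names below are written from memory and would need to be checked against the org.apache.derby.io javadocs before this could actually compile; it also assumes the concrete subclass hands back a factory already init()'ed against a scratch location:

    import java.io.OutputStream;
    import junit.framework.TestCase;
    import org.apache.derby.io.StorageFactory;
    import org.apache.derby.io.StorageFile;

    public abstract class StorageFactoryContractTest extends TestCase {

        /** Subclasses return the (already initialized) implementation under test. */
        protected abstract StorageFactory createFactory() throws Exception;

        public void testCreateWriteAndDelete() throws Exception {
            StorageFactory sf = createFactory();
            StorageFile f = sf.newStorageFile("contractTestFile");
            assertFalse(f.exists());
            OutputStream out = f.getOutputStream();
            out.write(new byte[] { 1, 2, 3 });
            out.close();
            assertTrue(f.exists());
            assertEquals(3, f.length());
            assertTrue(f.delete());
            assertFalse(f.exists());
            sf.shutdown();
        }
    }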


Make sense?

It does! :)

To sum it up: I'll try my best to test my StorageFactory with the tools I can find or will have to develop, and will report back once some minimal testing is done. Any help here is appreciated, of course :)
