On Fri, 10 Aug 2001 13:41:24 Sam Ruby wrote:
> Some background, so that perhaps you can understand both the depths of my
> insanity and what I am looking for.  As Arved mentioned, I am on the
> XML PMC.  I am also on the Jakarta PMC.  As I mentioned, one thing that
> drives me batty is projects not talking.  I am not singling out xml-fop.
> If you take a moment to check
> http://jakarta.apache.org/builds/gump/latest/, you will see more than a
> few projects that I am more or less following.
> 
> If there are any automated tests available, I will gladly run them daily
> against the absolute latest versions of Xerces, Xalan, and other
> dependencies, and publish the results.  This often has the effect of
> identifying integration problems early, which generally makes the culprit
> easier to identify and resolve.
> 
> An example of that occurred this morning.  The build of fop itself died
> with null pointer exceptions in codegen.  While not designed as a
> regression test, codegen certainly exercises some functionality in Xalan.
> The Xalan team has found this information valuable in the past.
> 
> So, if you look at the number of projects, that's a lot of FM to RTFM.
> But if you know of any existing automated tests and can point me in
> somewhat the right direction, I will try to figure out the rest.  From
> what I can tell, the page you pointed me to tells me how to get a new
> test added.

I understand that you may have a lot to do.
That is why I write pages like that, so people can read them (though I
often wonder why) and get the information quickly.

So to quote from that page:
"To setup the testing the developer must place a reference fop.jar in the
"<cvs_repository>/test/reference/" directory. This jar will be dynamically
loaded to create the reference output."

and ...

"The tests are stored in the "<cvs_repository>/test" directory. 

You can run the tests by specifying the build target "test", i.e.:
build.sh test 

This will then compare the output of the current code in the local src
directory with the output of a specified reference release of FOP. Any
differences between the two will be reported. If a test that previously
passed now produces different output, the test run will have failed."
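
Putting the two quotes together, a typical run looks roughly like this (a
sketch only; the exact location of your checkout and of the reference jar
you copy in are assumptions on my part, not something the page spells out
beyond the text quoted above):

  # copy a reference build of FOP into the test directory of the checkout
  cp fop.jar <cvs_repository>/test/reference/fop.jar

  # run the regression comparison against the local src directory
  cd <cvs_repository>
  ./build.sh test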

If that is not enough information or is otherwise not clear then it would
be helpful to say so.
As I said, the testing is in the early stages and currently doesn't provide
that much information.

