Hi Cameron,
Sorry this is being such a pain...
I took a look at the code for the transcoder tests. It doesn't look like they currently support accepted-variation files. This should be added.
It is really unfortunate that the svggen tests are failing. Short of sorting the attributes before comparison, I can't think of a good way to address this issue (I suppose adding accepted-variation support could be another route, but for some reason it doesn't seem as good to me).
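For what it's worth, here is a minimal sketch of the sorting idea: instead of comparing raw serializations, collect each element's attributes into a sorted map so that two documents that differ only in attribute order compare equal. This is not the current svggen test code, just an illustration (the class and method names are made up):

```java
import java.io.StringReader;
import java.util.Map;
import java.util.TreeMap;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Element;
import org.w3c.dom.NamedNodeMap;
import org.xml.sax.InputSource;

public class AttrOrderCompare {
    // Collect an element's attributes into a sorted map so that two
    // serializations differing only in attribute order compare equal.
    static Map<String, String> sortedAttrs(Element e) {
        Map<String, String> m = new TreeMap<>();
        NamedNodeMap attrs = e.getAttributes();
        for (int i = 0; i < attrs.getLength(); i++) {
            m.put(attrs.item(i).getNodeName(), attrs.item(i).getNodeValue());
        }
        return m;
    }

    // Parse a small XML string and return its document element.
    static Element parseRoot(String xml) throws Exception {
        DocumentBuilder b =
            DocumentBuilderFactory.newInstance().newDocumentBuilder();
        return b.parse(new InputSource(new StringReader(xml))).getDocumentElement();
    }

    public static void main(String[] args) throws Exception {
        // The same rect serialized with attributes in a different order.
        Element a = parseRoot("<rect x=\"1\" y=\"2\" width=\"3\"/>");
        Element b = parseRoot("<rect width=\"3\" x=\"1\" y=\"2\"/>");
        System.out.println(sortedAttrs(a).equals(sortedAttrs(b))); // true
    }
}
```

A full fix would of course have to walk the whole tree (children, text nodes, etc.), but this is the core of making the comparison order-independent.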
I have always gotten the permission test failures. I don't know whether this was a mistake by whoever created them or something else. At this point it probably makes sense to modify the tests to expect the thread stop permission.
The performance tests tend to be a bit twitchy. The suite does measure the performance of your machine and scales the expected result by the relative value, but as with most performance testing it can be affected by other activity on the system.
The text-i18n file has a stray text character where it should probably have only whitespace. It is easy to fix: replace "</text>><text" with "</text><text".
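In code, the fix is a plain string replacement (the fragment below is a made-up stand-in for the actual content of text-i18n-BE-09.svg):

```java
public class FixTextI18n {
    public static void main(String[] args) {
        // Hypothetical fragment containing the stray '>' between elements.
        String broken = "<text>a</text>><text>b</text>";
        String fixed = broken.replace("</text>><text", "</text><text");
        System.out.println(fixed); // <text>a</text><text>b</text>
    }
}
```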
I am nervous about the JFrame memory leak errors. This probably indicates that the JFrame object isn't being GCed. This might be normal on X-Windows, I don't know; if so, I suppose we could remove the check on the JFrame (or you could learn to 'accept' it for your case). I know that I had to do some 'hacking' of Swing to get to a point where the JFrame would go to GC on Windows (it's hard to tell Swing/AWT that a window is _really_ truly done). Lately two of the GC tests have been failing for me _most_ of the time (the objects I get are GVT, 't1', and 't2'). The GC is another twitchy part of the JVM: the test works really hard to get the JVM to do a complete GC run, but it isn't a 100% thing.
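The general pattern such tests use (this is a sketch, not the actual regard code) is to hold only a WeakReference to the object and then repeatedly request GC until the reference clears or the test gives up. Even this loop is best-effort, since System.gc() is only a hint to the JVM, which is exactly why these tests stay twitchy:

```java
import java.lang.ref.WeakReference;

public class GcCheck {
    // Best-effort check that an object becomes unreachable: with no strong
    // reference left, repeatedly request GC until the WeakReference clears
    // or we give up. System.gc() is only a hint, so this is not guaranteed.
    static boolean becomesCollectable(WeakReference<?> ref)
            throws InterruptedException {
        for (int i = 0; i < 20 && ref.get() != null; i++) {
            System.gc();
            Thread.sleep(50);
        }
        return ref.get() == null;
    }

    public static void main(String[] args) throws Exception {
        Object candidate = new byte[1024];
        WeakReference<Object> ref = new WeakReference<>(candidate);
        candidate = null; // drop the only strong reference
        System.out.println(becomesCollectable(ref));
    }
}
```

A JFrame that keeps failing this kind of check usually means something (a listener, a toolkit structure) still holds a strong reference to it.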
In general, for regular commits I live with a _few_ 'known' failures on my system. For releases I generally try to eliminate all failures, either by fixing the test, the references, or the bug that is causing the failure (the last is not that common).
Looking at your output, you are probably OK to deliver (I'm a bit nervous about the memory leak - does the clean 1.5.1 have the same problem?), but moving forward we should add accepted-variation support for the transcoder and svggen tests, because 160 errors is too many to examine closely.
I may take a look at the memory leak test again and see if there is anything to be done there.
One of the real advantages of regard is that even if there are a few false positives, it lets you know when you have 'unexpected' side effects.
Cameron McCormack wrote:
Ok, I'm still trying to get regard to have no failed tests on the latest Batik CVS. I've got all the rendering accuracy tests fixed up by copying the different images to the accepted-variation directories. (The actual differences seemed to be either because of fonts or slight differences in anti-aliasing or colours.)
I'm left now with 130 failed tests. Many of these are from the transcoder unit testing. Is there a way I can specify accepted-variations for these tests like with the rendering accuracy tests?
Many of the svggen tests are failing because the serialised SVG documents have attributes in a different order from the reference file. What should I do about these?
There are a few memory leak tests and some permissions tests which fail. The DoubleStringPerformanceTest claims an unexpected improvement, just because I have a faster machine than the reference machine the tests were performed on, I guess. The reference score is specified in the test XML file though. How could I make this test pass without modifying that file?
Finally, one of the SVG BE tests fails, claiming that the SVG file (text-i18n-BE-09.svg) doesn't validate against the DTD.
The test report is here:
http://mcc.id.au/~cam/2004.07.01-20h51m35s/html/regardReport.html
Help to make these tests pass appreciated!
Thanks,
Cameron
