Hi Vincent,
Thanks for the update. Ideally we should investigate the remaining failures before the release.
Thanks,
Chris
On 07/08/2014 19:53, Vincent Hennebert wrote:
Some update on this.
More than half of the failing tests (369, to be exact) actually correspond to slight rendering differences between the reference images and what my platform produces (many of them due to different fonts being used, but not all).
I could update the reference images, but then people on other platforms might still get different results.
I prefer to use the available ‘accepted variation’ mechanism: Batik generates a difference image between the reference and the actual rendering. If I decide that the difference is small enough to be acceptable, I copy the PNG from the candidate-variation folder to accepted-variation.
I can even add a platform-specific suffix if I wish (_java6-linux in my case; the other one available is _java5-osx, which is probably outdated). That would be helpful for people who work on several different platforms, but I found out that not all tests offer that possibility, so in the end it’s of limited use.
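Concretely, accepting a variation boils down to a copy along these lines (the paths are only illustrative; the exact layout depends on the test):

    cp test-references/samples/anchor/candidate-variation/anchor.png \
       test-references/samples/anchor/accepted-variation/anchor.png

appending _java6-linux before the extension if I want the accepted variation to be platform-specific.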
Attached to this message is the list of tests that can safely be marked as accepted.
There are 180 remaining tests that fail. I put the report on my public
space on Apache:
http://people.apache.org/~vhennebert/batik-test-suite/testReport/html/regardReport.html
Most of them fall into one of the following 8 categories:
• TranscoderException
• NullPointerException
• Gradients not being rendered
• Roundtrip failures: generating an in-memory representation of the SVG and writing it back to an SVG file
• Permissions issues
• Memory leaks
• WMF conversion issues
• Some tests that don't honour an accepted-variation
So the next step would be to investigate what’s going wrong in each
category and fix the issue.
But that’s another story...
Vincent
On 28/07/14 23:43, Vincent Hennebert wrote:
Hi,
I thought I would post here my findings about running the Batik test
suite.
The test suite can be run on the command line by using the ‘regard’
target from build.xml. It produces an HTML report that lists the failing
test cases.
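For example, from the project root (assuming a standard Ant setup):

    ant regard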
To begin with, the Basic Effectivity test suite from the SVG Working Group will probably have to be removed: it’s not shipped with Batik (the code points to a directory outside the project root), and I can no longer find any trace of it on the Internet. Unless I missed something, of course.
The rest of the test suite runs but results in many failures (nearly 500). Most of the tests probably just need to be updated (some of them are based on comparing a bitmap rendering against a reference image). But some of them may well point to regressions, and I can see quite a few exceptions. Do we want to take the risk of releasing Batik with a failing test suite? Probably not.
This test suite looks like it was created before JUnit became the de
facto standard for testing. It looks rather powerful and elegantly
designed but, needless to say, it doesn’t quite have the same tooling
support as JUnit (IDE integration, CI, code coverage, etc.). Also, it
opens up several windows to render some SVG tests, which is quite disruptive. We might want to convert it to JUnit and reduce it to
a subset that can run in the background.
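To give an idea, a converted test might look roughly like the sketch below. This is only an illustration under assumptions: the sample and reference paths are made up, and a real port would reuse the existing comparison and accepted-variation logic rather than the naive pixel check shown here.

    import static org.junit.Assert.assertEquals;

    import java.awt.image.BufferedImage;
    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.File;
    import javax.imageio.ImageIO;
    import org.apache.batik.transcoder.TranscoderInput;
    import org.apache.batik.transcoder.TranscoderOutput;
    import org.apache.batik.transcoder.image.PNGTranscoder;
    import org.junit.Test;

    public class AnchorRenderingTest {
        @Test
        public void renderingMatchesReference() throws Exception {
            // Transcode the sample SVG to a PNG held in memory; no window is opened
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            new PNGTranscoder().transcode(
                    new TranscoderInput(new File("samples/anchor.svg").toURI().toString()),
                    new TranscoderOutput(out));
            BufferedImage actual = ImageIO.read(new ByteArrayInputStream(out.toByteArray()));
            // Hypothetical reference location, for illustration only
            BufferedImage reference = ImageIO.read(new File("test-references/samples/anchor.png"));
            // Naive pixel-by-pixel comparison; a real port would honour accepted variations
            assertEquals(reference.getWidth(), actual.getWidth());
            assertEquals(reference.getHeight(), actual.getHeight());
            for (int y = 0; y < reference.getHeight(); y++) {
                for (int x = 0; x < reference.getWidth(); x++) {
                    assertEquals(reference.getRGB(x, y), actual.getRGB(x, y));
                }
            }
        }
    }

Tests in that style would run headless and plug straight into the usual JUnit tooling.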
But the most urgent task is probably to go through the test results and investigate the failing ones.
Vincent
---------------------------------------------------------------------
To unsubscribe, e-mail: batik-dev-unsubscr...@xmlgraphics.apache.org
For additional commands, e-mail: batik-dev-h...@xmlgraphics.apache.org