Hi Lee, I'd really appreciate some help in this test area, since it's crucial for good user experiences.
The current setup is that there are dozens of functional tests for JiBX included under the /build/test directory. These mostly take the form of a binding applied to some test classes, exercised by first unmarshalling a test document, then marshalling it back out, and finally comparing the marshalled document to the original, ignoring whitespace and other nonessentials. The build.xml file has a long list of these test cases (plus variations where different bindings are used for unmarshalling and marshalling, and where different test documents are used). There's only a single JUnit test case at present, org.jibx.runtime.UtilityTest.

In and of itself I don't think this is a bad thing. It'd be difficult to run the functional tests as JUnit tests, because of the way JiBX works and the number of different documents involved. But the big thing that gets missed with this approach is that I'm only testing what works, not what reports an error. I'd really like to see more testing of error conditions in the binding compiler, to make sure that problems are reported directly to the user during validation of the binding definition, as opposed to just causing an exception during code generation or (worse) some obscure runtime failure.

The binding validation code works with a ValidationContext. When the binding compiler is executed directly it runs the validation and then reports the errors in the console output. It's easy to run the same process directly, though - just use the org.jibx.binding.model.BindingElement newValidationContext() method, followed by the validateBinding() method. All problems found in the binding are recorded in the validation context.
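To make the flow concrete, here's a minimal sketch of driving validation programmatically and then checking what was reported. Note this uses simplified local stand-ins for the org.jibx.binding.model classes so it runs on its own - any method name beyond newValidationContext() and validateBinding() is an assumption for illustration, not the real JiBX API:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-ins for the org.jibx.binding.model classes, for
// illustration only; the real classes carry more information.
class ValidationProblem {
    final String severity; // "warning", "error", or "fatal"
    final String element;  // binding element the problem was reported on
    ValidationProblem(String severity, String element) {
        this.severity = severity;
        this.element = element;
    }
}

class ValidationContext {
    private final List<ValidationProblem> problems = new ArrayList<ValidationProblem>();
    void addProblem(ValidationProblem p) { problems.add(p); }
    List<ValidationProblem> getProblems() { return problems; }
}

class BindingElement {
    static ValidationContext newValidationContext() { return new ValidationContext(); }

    // In real JiBX this walks the binding model and records any problems
    // in the supplied context; here we fake a single error for the demo.
    void validateBinding(ValidationContext ctx) {
        ctx.addProblem(new ValidationProblem("error", "value"));
    }
}

public class BindingValidationSketch {
    // The kind of check a JUnit test would make: did validation report a
    // problem of the expected severity on the expected binding element?
    static boolean hasProblem(ValidationContext ctx, String severity, String element) {
        for (ValidationProblem p : ctx.getProblems()) {
            if (p.severity.equals(severity) && p.element.equals(element)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // In a real test the binding would be loaded from a binding XML file.
        BindingElement binding = new BindingElement();
        ValidationContext ctx = BindingElement.newValidationContext();
        binding.validateBinding(ctx);
        System.out.println(hasProblem(ctx, "error", "value"));   // true
        System.out.println(hasProblem(ctx, "fatal", "binding")); // false
    }
}
```

The hasProblem() check is the heart of what the error-condition tests described below would assert, with JUnit's assertTrue() in place of the println calls.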
So what I'd suggest as a starting point for improving the JiBX testing is to write some JUnit tests that validate a bunch of different bindings with known problems, and make sure that the appropriate problems (warning, error, or fatal) are reported on the correct binding elements (both of which you can get from the actual org.jibx.binding.model.ValidationProblem items). I'd recommend setting up a set of data classes to be used for the bindings, and keeping the binding definitions as XML files rather than embedding them in the JUnit code. Some of the errors to be tested are obvious, such as references to classes or methods that don't exist, but you can find others by going through the JiBX documentation and trying things that are supposed to be forbidden. If you find something that's supposed to be forbidden but isn't actually checked in the binding validation, then either the documentation needs to change or a check needs to be added to the validation.

Once you've done some work on this we can discuss ways to make it easier. For instance, it'd probably make sense to embed the expected error information directly in the binding definitions. This could be done using special added elements in the binding, say "<annotation>" elements to mimic the schema approach. That's something I've been thinking about implementing anyway, since I've thought of a few cases where extension information (related to the binding, but not actually part of it) would be useful in the binding definitions.

The test work will hopefully get you familiar with both JiBX itself and the binding model validation, which is a key part of JiBX for most of the interesting development to be done in the future (including the new generation tools I'm working on now, along with things like IDE support for flagging binding errors; I'll even be driving the code generation directly off the binding model when I change over to using ASM for that purpose).

  - Dennis

Dennis M. Sosnoski
SOA, Web Services, and XML Training and Consulting
http://www.sosnoski.com - http://www.sosnoski.co.nz
Seattle, WA +1-425-296-6194 - Wellington, NZ +64-4-298-6117


Shih-gian Lee wrote:
> Hello,
>
> I would like to contribute to the JiBX project. I have been using Castor
> but came across this binding tool and it is kind of cool. So, I would
> like to help out the community.
>
> I have done what is required on this page -
> http://jibx.sourceforge.net/contributing.html
>
> I would like to know if you still need someone to work on the tasks
> listed above? If so, I would like to start writing some JUnit test
> cases and get familiar with the code base and hopefully can contribute
> to the source after writing some JUnit test cases. I think writing
> test cases is a good way for me to start learning the code base.
>
> Thank you.
>
> - Lee

_______________________________________________
jibx-devs mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/jibx-devs
