curcuru 01/03/01 16:50:13
Modified: test/java/xdocs/sources/tests design.xml getstarted.xml
overview.xml run.xml
Log:
Basic documentation updates; brief description of new Testlets stuff
Revision Changes Path
1.3 +19 -6 xml-xalan/test/java/xdocs/sources/tests/design.xml
Index: design.xml
===================================================================
RCS file: /home/cvs/xml-xalan/test/java/xdocs/sources/tests/design.xml,v
retrieving revision 1.2
retrieving revision 1.3
diff -u -r1.2 -r1.3
--- design.xml 2001/01/08 22:57:37 1.2
+++ design.xml 2001/03/02 00:50:08 1.3
@@ -25,7 +25,7 @@
output file with a .out extension.</item>
<label>What kinds of tests does Xalan have?</label>
<item>There are several different ways to categorize the
- tests currently used in Xalan: API tests, specific tests
+ tests currently used in Xalan: API tests and testlets, specific tests
for detailed areas of the API in Xalan; Conformance Tests,
with stylesheets in the tests\conf directory that each test
conformance with a specific part of the XSLT spec, and are
@@ -46,7 +46,7 @@
something failed in an unexpected way, and AMBG or ambiguous tests,
where the test appears to have completed but the output results
haven't been verified to be correct yet.
- <link anchor="overview-tests-results">See full description
below.</link></item>
+ <link anchor="overview-tests-results">See a full description of test
results.</link></item>
<label>How are test results stored/displayed?</label>
<item>Xalan tests all use
<jump href="apidocs/org/apache/qetest/Reporter.html">Reporter</jump>s
and
@@ -58,9 +58,10 @@
determines where it will produce its MyTestResults.xml file, which
is the complete report of what the test did, as saved to disk by
its XMLFileLogger. You can
- then use <link idref="run"
anchor="how-to-view-results">viewResults.bat</link>
+ then use <link idref="run"
anchor="how-to-view-results">viewResults.xsl</link>
to pretty-print the results into a MyTestResults.html
- file that you can view in your browser.
+ file that you can view in your browser. We are working on other
+ stylesheets to output results in different formats.
</item>
<label>What are your file/test naming conventions?</label>
<item>See the sections below for <link
anchor="standards-api-tests">API test naming</link> and
@@ -152,6 +153,16 @@
<item>As in 'ConformanceTest', 'PerformanceTest', etc.: a single,
automated test file designed to be run from the command line or
from a testing harness.</item>
+ <label>*Testlet.java/.class</label>
+ <item>As in '<jump
href="apidocs/org/apache/qetest/xsl/StylesheetTestlet.html">StylesheetTestlet</jump>',
'PerformanceTestlet', etc.: a single,
+ automated testlet designed to be run from the command line or
+ from a testing harness. Testlets are generally focused on one
+ or a few test points, and are usually data-driven.</item>
+ <label>*Datalet.java/.class</label>
+ <item>As in '<jump
href="apidocs/org/apache/qetest/xsl/StylesheetDatalet.html">StylesheetDatalet</jump>':
a single set of test data for
+ a Testlet to execute. Separating a specific set of data from the
+ testing algorithm to use with the data makes it easy to write
+ and run large sets of data-driven tests.</item>
<label>*APITest.java/.class</label>
<item>As in 'TransformerAPITest', etc.: a single,
automated test file designed to be run from the command line or
@@ -174,8 +185,9 @@
Thus we can hook a LoggingErrorHandler up to a Transformer, run a
stylesheet with known errors through it, and then go back and validate
that the Transformer logged the appropriate errors with this
service.</item>
- <label></label>
- <item></item>
+ <label>QetestUtils.java/.class</label>
+ <item>A simple static utility class with a few general-purpose
+ utility methods for testing.</item>
</gloss>
<p>Please: if you plan to submit Java API tests, use the existing
framework
as <link idref="submit" anchor="write-API-tests">described</link>.</p>
@@ -201,6 +213,7 @@
<code>cd xml-xalan\test</code><br/>
<code>ContribTest.bat -category foo</code><br/>
</p>
+
</s2>
<anchor name="testing-links"/>
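
For readers new to the Testlet/Datalet split described in the design.xml changes
above, here is a rough sketch of the pattern in Java. It is illustrative only:
the names below (Datalet, Testlet, StylesheetData, TransformTestlet) are
simplified stand-ins, and the real org.apache.qetest interfaces may have
different methods and signatures.

    // Illustrative sketch only: the real org.apache.qetest Testlet/Datalet
    // interfaces may have different methods; names here are simplified stand-ins.
    interface Datalet {
        String getDescription();            // what this bundle of test data covers
    }

    interface Testlet {
        void execute(Datalet data);         // apply one testing algorithm to one Datalet
    }

    // One xml/xsl/gold file set per Datalet makes the tests data-driven.
    class StylesheetData implements Datalet {
        String xmlFile, xslFile, goldFile;
        StylesheetData(String xml, String xsl, String gold) {
            xmlFile = xml; xslFile = xsl; goldFile = gold;
        }
        public String getDescription() { return xslFile + " against " + xmlFile; }
    }

    class TransformTestlet implements Testlet {
        public void execute(Datalet d) {
            StylesheetData data = (StylesheetData) d;
            // transform data.xmlFile with data.xslFile, compare the output to
            // data.goldFile, and report checkPass()/checkFail() accordingly
        }
    }

Separating the data (Datalet) from the algorithm (Testlet) is what lets one
driver run the same test logic over large directory trees of stylesheets.
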
1.3 +9 -2 xml-xalan/test/java/xdocs/sources/tests/getstarted.xml
Index: getstarted.xml
===================================================================
RCS file: /home/cvs/xml-xalan/test/java/xdocs/sources/tests/getstarted.xml,v
retrieving revision 1.2
retrieving revision 1.3
diff -u -r1.2 -r1.3
--- getstarted.xml 2000/12/06 20:51:54 1.2
+++ getstarted.xml 2001/03/02 00:50:09 1.3
@@ -32,7 +32,8 @@
<s2 title="Building the Tests">
<p>This builds both the test harness/framework/etc. and the specific
Java API test files.
It works similarly for DOS/Windows or your flavor of UNIX; in most cases
*.sh files are
- provided to match the *.bat files used here.</p>
+ provided to match the *.bat files used here. Plans are underway to also provide
+ an Ant-based way to run tests, since Ant is already cross-platform.</p>
<p>1: Use CVS to check out the whole xml-xalan repository locally.
<br/><code>e:\builds\xml-xalan</code></p>
<p>2: Set an environment variable JARDIR to point to directory where you
have all applicable jars.
@@ -53,7 +54,13 @@
<br/>Xalan-J 2.x:
<br/><code>build.bat package.xalan2</code> (<ref>or package.trax</ref>)
</p>
- <p>This will create <code>build\testxsl.jar</code>, which you should
manually copy into JARDIR before you run any tests.</p>
+ <p>This will create <code>build\testxsl.jar</code>, which you should
manually copy
+ into JARDIR before you <link idref="run" anchor="how-to-run">run any
tests</link>.</p>
+ <note>The use of JARDIR is not required; you're free to manage the classpath
+ yourself, and if JARDIR is not set the test build file will default to the
+ Xalan-J 2.0 directories. I've found that putting all the needed jars in a
+ JARDIR makes it easy to switch between testing different builds.</note>
+
<p>Note that ProcessorWrapper subclasses for XT and SAXON are currently
checked in
both as .java source files and as precompiled .class files - the .class
files are
merely copied into the jar by default, so you don't need the other
processors
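
As a side note on the JARDIR convention discussed above, here is a minimal
sketch of the fallback logic. It assumes JARDIR reaches the code as a system
property and that the default is the sibling xml-xalan/java/build directory;
the actual build files may resolve this differently.

    // Minimal sketch of the JARDIR fallback described above. Assumes JARDIR is
    // passed in as a system property and that the default is the sibling
    // xml-xalan/java/build directory; the real build files may differ.
    import java.io.File;

    class JarDirSketch {
        static File jarDirectory() {
            String jarDir = System.getProperty("JARDIR");
            if (jarDir != null && jarDir.length() > 0) {
                return new File(jarDir);
            }
            // default: assume the tests sit next to xml-xalan/java, as checked out
            return new File(".." + File.separator + "java" + File.separator + "build");
        }
    }
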
1.4 +46 -40 xml-xalan/test/java/xdocs/sources/tests/overview.xml
Index: overview.xml
===================================================================
RCS file: /home/cvs/xml-xalan/test/java/xdocs/sources/tests/overview.xml,v
retrieving revision 1.3
retrieving revision 1.4
diff -u -r1.3 -r1.4
--- overview.xml 2000/12/06 20:51:56 1.3
+++ overview.xml 2001/03/02 00:50:09 1.4
@@ -73,38 +73,30 @@
<p>Java Test Drivers</p>
<p>A Java Test Driver executes a test for each xml/xsl file pair in
the specified directory tree or each pair in the specified fileList.
-For each test, the driver generates an output file which may be
-compared to the corresponding file in the "gold" tree. The Test
-Drivers supply the code to determine which files to operate on
-and the order to do those operations (including logging performance
-and other data out).</p>
-<p>The Test Drivers rely on ProcessorWrapper
+For each test, the driver iterates over the tree or list of files
+and asks a Testlet to execute a test on each one.
+The best example is <jump
href="apidocs/org/apache/qetest/xsl/StylesheetTestletDriver.html">StylesheetTestletDriver</jump>.</p>
+<p>The Test Drivers rely on various Testlet implementations
+to define the actual testing algorithm to apply to each xml/xsl
+file pair. The Testlet defines any options to be used when processing the
+file, and logs information about the test in progress.
+Examples include
+<jump
href="apidocs/org/apache/qetest/xsl/StylesheetTestlet.html">StylesheetTestlet</jump>
and
+<jump
href="apidocs/org/apache/qetest/xsl/PerformanceTestlet.html">PerformanceTestlet</jump></p>
+<p>The Testlets rely on ProcessorWrapper
subclasses to perform the actual test of processing or transformation
of the xml.xsl file pair into the output file. We can then plug
in different ProcessorWrapper "flavors" easily. Different
ProcessorWrappers can process or transform in various ways, like
using DOM trees, SAX events, or input/output streams.</p>
+<p>The three levels of iteration, test algorithm, and
+processor flavor are all independently changeable, so we can
+easily try out different kinds of tests.</p>
<gloss>
-<label>org.apache.qetest.xsl.ConformanceTest</label>
-<item>basic test driver, either takes an
-inputDir to iterate over (using ConformanceDirRules/ConformanceFileRules),
-or an explicit fileList. Processes all files using a specific
--flavor of a <jump
href="apidocs/org/apache/qetest/xslwrapper/ProcessorWrapper.html">ProcessorWrapper</jump>,
so identical test runs can be done
-using different processors (e.g., -flavor xalan = XalanWrapper = Xalan-J
1.x;
--flavor trax = TraxWrapper = Trax interface using streams by default;
--flavor trax.d2d = TraxWrapper = Trax interface using DOMs)
-<br/>Actually, 'ConformanceTest' is a bad name - this is a generic
stylesheet
-test driver that can be used to run using any matching xml source
-and xsl stylesheet file names, as specified in the ConformanceFileRules,
-not just conformance tests.
-<br/>Suggestions for alternate names welcomed!
-'StylesheetTestDriver' perhaps?</item>
-<label>org.apache.qetest.xsl.PerformanceTest</label><item>essentially the
same as ConformanceTest,
-but provides additional timing/memory output, as well as an -iterations
-parameter to iterate over each file a bunch of times to get average timing
data
-</item>
<label>org.apache.qetest.xsl.<link idref="run"
anchor="how-to-run-c">CConformanceTest</link></label>
-<item>essentially the same as ConformanceTest, but for Xalan-C.</item>
+<item>essentially the same as ConformanceTest, but for Xalan-C. I plan to
+move this to the Testlet model soon to simplify it, and
+to provide better support for differing command lines.</item>
</gloss>
<p>Java API tests for the TRAX (or javax.xml.transform) interface, that
@@ -114,7 +106,11 @@
<label>REPLACE_template_for_new_tests.java</label>
<item>a template for creating new TRAX API tests, see <link idref="submit"
anchor="write-API-tests">Submitting New Tests</link></item>
<label>LoggingErrorListener.java</label>
-<item><ref>utility:</ref> wraps javax.xml.transform.ErrorListener, and logs
info</item>
+<item><ref>utility:</ref> wraps javax.xml.transform.ErrorListener, and logs
info;
+this class also supports setting expected errors to trap, and it will call
+logger.checkPass/checkFail for you when it gets an expected or unexpected
event.
+This allows us to write very detailed negative tests that are
+fully automated.</item>
<label>LoggingURIResolver.java</label>
<item><ref>utility:</ref> wraps javax.xml.transform.URIResolver, and logs
info</item>
<label>ExamplesTest.java</label>
@@ -127,11 +123,21 @@
<item>API coverage tests for javax.xml.transform.TransformerFactory</item>
<label>TemplatesAPITest.java</label>
<item>API coverage tests for javax.xml.transform.Templates</item>
-<label>ResultAPITest.java</label>
-<item>API test for Result class - may be obsolete, should
-have separate tests for SAXResult, DOMResult, StreamResult</item>
-<label>ProcessorAPITest.java</label>
-<item>API test: obsolete: from a previous version of TRAX</item>
+
+<label>EmbeddedStylesheetTest.java</label>
+<item>Testing various kinds of stylesheets embedded with the
xml-stylesheet PI</item>
+<label>ErrorListenerAPITest.java</label>
+<item>API Coverage test for ErrorListener</item>
+<label>ErrorListenerTest.java</label>
+<item>Functionality test of error listeners when using illegal
stylesheets</item>
+<label>OutputPropertiesTest.java</label>
+<item>Various tests of programmatic access and changing of output
properties</item>
+<label>SystemIdImpInclTest.java</label>
+<item>Testing various forms of URLs in setSystemId with imported and
included stylesheets</item>
+<label>SystemIdTest.java</label>
+<item>Testing various forms of URLs in setSystemId</item>
+
+
<label>TestThreads.java</label>
<item>MANUALLY executed test for running multiple threads
and transforming multiple stylesheets simultaneously.</item>
@@ -140,27 +146,27 @@
<p>All in subpackages of: org.apache.qetest.trax</p>
<gloss>
<label>stream.StreamSourceAPITest.java</label>
-<item>API coverage tests for javax.xml.transform.stream.StreamSource (mostly
done)</item>
+<item>API coverage tests for javax.xml.transform.stream.StreamSource</item>
<label>stream.StreamResultAPITest.java</label>
-<item>API coverage tests for javax.xml.transform.stream.StreamResult (mostly
done)</item>
+<item>API coverage tests for javax.xml.transform.stream.StreamResult</item>
<label>dom.DOMSourceAPITest.java</label>
-<item>API coverage tests for javax.xml.transform.dom.DOMSource (mostly
done)</item>
+<item>API coverage tests for javax.xml.transform.dom.DOMSource</item>
<label>dom.DOMResultAPITest.java</label>
-<item>API coverage tests for javax.xml.transform.dom.DOMResult (mostly
done)</item>
+<item>API coverage tests for javax.xml.transform.dom.DOMResult</item>
<label>dom.DOMLocatorAPITest.java</label>
<item>API coverage tests for javax.xml.transform.dom.DOMLocator
(@todo)</item>
<label>sax.SAXSourceAPITest.java (to be done)</label>
-<item>API coverage tests for javax.xml.transform.sax.SAXSource (@todo)</item>
+<item>API coverage tests for javax.xml.transform.sax.SAXSource</item>
<label>sax.SAXResultAPITest.java (to be done)</label>
-<item>API coverage tests for javax.xml.transform.sax.SAXResult (@todo)</item>
+<item>API coverage tests for javax.xml.transform.sax.SAXResult</item>
<label>sax.SAXTransformerFactoryAPITest.java (to be done)</label>
-<item>API coverage tests for javax.xml.transform.sax.SAXTransformerFactory
(@todo)</item>
+<item>API coverage tests for
javax.xml.transform.sax.SAXTransformerFactory</item>
<label>sax.TemplatesHandlerAPITest.java (to be done)</label>
-<item>API coverage tests for javax.xml.transform.sax.TemplatesHandler
(@todo)</item>
+<item>API coverage tests for javax.xml.transform.sax.TemplatesHandler</item>
<label>sax.TransformerHandlerAPITest.java (to be done)</label>
-<item>API coverage tests for javax.xml.transform.sax.TransformerHandler
(@todo)</item>
+<item>API coverage tests for
javax.xml.transform.sax.TransformerHandler</item>
</gloss>
<p>Java API tests for Xalan-J 1.x.</p>
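
To make the three-layer picture in the overview.xml changes concrete, here is a
rough sketch of how a driver in the style of StylesheetTestletDriver might walk
a directory and hand each xml/xsl pair to a Testlet. It reuses the illustrative
Testlet/StylesheetData types sketched after the design.xml diff above; the real
driver's file rules, -fileList support, and APIs are more involved.

    // Illustrative driver loop only; the real StylesheetTestletDriver has richer
    // directory/fileList rules. Testlet and StylesheetData are the sketch types
    // from the earlier note, not the actual qetest classes.
    import java.io.File;
    import java.io.FilenameFilter;

    class DriverSketch {
        void runOver(File inputDir, File goldDir, Testlet testlet) {
            File[] sheets = inputDir.listFiles(new FilenameFilter() {
                public boolean accept(File dir, String name) {
                    return name.endsWith(".xsl");
                }
            });
            for (int i = 0; i < sheets.length; i++) {
                String base = sheets[i].getName();
                base = base.substring(0, base.length() - ".xsl".length());
                // pair foo.xsl with foo.xml; compare output against goldDir's foo.out
                testlet.execute(new StylesheetData(
                    new File(inputDir, base + ".xml").getPath(),
                    sheets[i].getPath(),
                    new File(goldDir, base + ".out").getPath()));
            }
        }
    }

Because the iteration lives in the driver, the algorithm in the Testlet, and the
processor flavor in the ProcessorWrapper, each layer can be swapped independently.
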
1.4 +26 -24 xml-xalan/test/java/xdocs/sources/tests/run.xml
Index: run.xml
===================================================================
RCS file: /home/cvs/xml-xalan/test/java/xdocs/sources/tests/run.xml,v
retrieving revision 1.3
retrieving revision 1.4
diff -u -r1.3 -r1.4
--- run.xml 2001/01/08 22:57:39 1.3
+++ run.xml 2001/03/02 00:50:10 1.4
@@ -17,23 +17,23 @@
from your application, or from
<jump
href="apidocs/org/apache/qetest/xsl/XSLTestHarness.html">XSLTestHarness</jump>.
There really isn't any magic to them: you can just set your classpath
and
- execute java.exe to run them. However we have provided a couple of more
+ execute java.exe to run them; some Tests and Testlets currently provide
defaults
+ for their inputs, so you can run them without any setup at all.
+ However, we have provided a couple of more
convenient ways to run the most common tests:</p>
<p>1: <link idref="getstarted" anchor="how-to-build">Build a fresh copy
of testxsl.jar.</link>
<br/></p>
<p>2: Set the JARDIR environment variable, and put
<code>testxsl.jar</code> and the other required JAR files in the JARDIR
directory.<br/></p>
<note>The tests will now default to using Xalan-J 2.x if JARDIR is not
set,
presuming that you have the tests in the same tree as xml-xalan\java
(just like
- you get if you checkout the tree).</note>
+ you get if you check out the tree); you can use a JARDIR or not, whichever you
find convenient.</note>
<p>3: cd xml-xalan\test<br/></p>
<p>4: Run any of the convenience batch files (see below) or run java.exe
with the desired test class.<br/></p>
<p>
- <code>contribtest.bat [<link
anchor="test-options">options</link>]</code>
- <br/>(runs ConformanceTest driver over tests\contrib test
tree)<br/><br/>
- <code>ConformanceTest.bat [<link
anchor="test-options">options</link>]</code>
- <br/>(runs ConformanceTest driver over tests\conf test
tree)<br/><br/>
- <code>PerformanceTest.bat [<link
anchor="test-options">options</link>]</code>
- <br/>(runs PerformanceTest driver over tests\perf test
tree)<br/><br/>
+ <code>conf.bat [<link anchor="test-options">options</link>]</code>
+ <br/>(runs StylesheetTestletDriver over tests\conf test tree using
the default StylesheetTestlet)<br/><br/>
+ <code>perf.bat [<link anchor="test-options">options</link>]</code>
+ <br/>(runs StylesheetTestletDriver over tests\perf test tree using
the default PerformanceTestlet)<br/><br/>
<code>traxapitest.bat TRAXAPITestClassName [<link
anchor="test-options">options</link>]</code>
<br/>(runs TRAX interface tests with Xalan-J 2.x, equivalent to <br/>
<code>runtest trax.TRAXAPITestClassName -load APITest.properties
[<link anchor="test-options">options</link>]</code><br/>
@@ -55,9 +55,11 @@
also be executed by hand, for those who wish to manage their own
classpaths and/or
simply pass all arguments to the tests on the command line, etc.</p>
<p>Sorry! We don't have .sh equivalents for the convenience .bat files -
- submissions of ports of these files are welcomed!</p>
- <p>We are also working on integrating the running of tests into
- the various Ant build.xml files, both for the testing build.xml
+ submissions of ports of these files are welcomed! We do plan to add an Ant
+ task that can run Xalan Tests and Testlets, so in the future both building
+ and running tests will mostly be done from an Ant buildfile, which is
+ cross-platform by default.</p>
+ <p>Some tests are partially integrated into our Ant build.xml files,
both for the testing build.xml
file as well as the one for Xalan-J 2.x itself. For example, to run
the Xalan-J 2.x Minitest, you may now do:<br/>
<code>cd xml-xalan\java</code><br/>
@@ -101,19 +103,25 @@
<p>To see all options, call a test with an illegal argument to force
it
to print out its .usage(). You may mix setting options from a
properties
file and from the command line; command line options will take
precedence.</p>
- <p>For another description of options, see
<br/><code>xml-xalan\test\ContribTest.properties</code>,<br/>
- which describes most of them as used in the context of running the
ConformanceTest driver
- over the xml-xalan\tests\contrib tree of stylesheets. Simply change
inputDir and goldDir
- to run over a different set of files (like a conformance test suite,
which we hope to
+ <p>For another description of options, see
<br/><code>xml-xalan\test\conf.properties</code>,<br/>
+ which describes most of them as used in the context of running the
StylesheetTestletDriver
+ over the xml-xalan\tests\conf tree of stylesheets. Simply change
inputDir and goldDir
+ to run over a different set of files (like a performance test suite,
which we hope to
add soon).</p>
<note>Path-like options set in a properties file must use the local
system's
File.separator character, and backslashes \ must be escaped \\. The
checked in
- copy of ContribTest.properties is for a Windows platform.</note>
+ copy of conf.properties is for a Windows platform.</note>
<p>Quick list of options</p>
<anchor name="test-options-logfile"/>
<gloss>
<label>-logFile <ref>resultsFileName.xml</ref></label>
<item>sends test results to an XML-based results file</item>
+ <label>-loggingLevel <ref>nn</ref></label>
+ <item>determines how much information is sent to your logFile,
0=very little, 99=lots</item>
+ <label>-ConsoleLogger.loggingLevel <ref>nn</ref></label>
+ <item>determines how much information is sent just to the default
ConsoleLogger:
+ since often you won't be watching the console as the test is
running, you can set this
+ lower than your loggingLevel to speed the tests up a little</item>
<label>-load <ref>file.properties</ref></label>
<item>(read in a .properties file, that can set any/all of the
other opts)</item>
<label>-inputDir <ref>path/to/tests</ref></label>
@@ -130,16 +138,10 @@
<label>-flavor <ref>xalan|trax|trax.d2d</ref></label>
<item>which kind/flavor of Processor to test; see
<jump
href="apidocs/org/apache/qetest/xslwrapper/ProcessorWrapper.html">ProcessorWrapper.java</jump>
</item>
- <label>-noReuse</label>
- <item>will force the ProcessorWrapper recreate a new processor for
each file processed
- (normally, it re-uses the same processor when possible by calling
reset() or some
- like method)</item>
+ <label>-testlet <ref>TestletClassname</ref></label>
+ <item>For StylesheetTestletDriver, use a different class for the
testing algorithm</item>
<label>-debug</label>
<item>prints extra test debugging info</item>
- <label>-precompile</label>
- <item>will use a precompiled stylesheet, if applicable to that
ProcessorWrapper</item>
- <label>-noErrTest</label>
- <item>will skip running '*err' subdirectory tests, if applicable
(<ref>subject to change</ref>)</item>
</gloss>
<p>Note that most options work equivalently with either Xalan-J or
Xalan-C tests.</p>
</s2>
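
Finally, a small sketch of the option-precedence rule documented in run.xml
("-load file.properties" sets a baseline, then command-line options override
it). The option names come from the list above; the parsing code itself is
illustrative, not the actual test framework's, and it ignores value-less flags
such as -debug.

    // Illustrative only: baseline options come from any -load file.properties,
    // then command-line "-name value" pairs override them, matching the
    // precedence described above. Value-less flags like -debug are not handled.
    import java.io.FileInputStream;
    import java.util.Properties;

    class OptionSketch {
        static Properties parse(String[] args) throws Exception {
            Properties opts = new Properties();
            // first pass: read the -load properties file, if any, as the baseline
            for (int i = 0; i < args.length - 1; i++) {
                if ("-load".equals(args[i])) {
                    FileInputStream in = new FileInputStream(args[i + 1]);
                    opts.load(in);
                    in.close();
                }
            }
            // second pass: command-line pairs like "-inputDir path" take precedence
            for (int i = 0; i < args.length - 1; i++) {
                if (args[i].startsWith("-") && !"-load".equals(args[i])) {
                    opts.setProperty(args[i].substring(1), args[i + 1]);
                }
            }
            return opts;
        }
    }

For example, "-load APITest.properties -loggingLevel 99" would keep everything
from APITest.properties except loggingLevel, which the command line overrides.
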