Hi Neil,

Neil Graham writes:
> The methodology behind the tests isn't discussed very much though:  In
> particular, what options did you give Xerces?  For instance, 

We should just post the source code for the tests!  So I've done
that. The tests aren't buildable standalone yet, but the code is
up. As soon as our webserver is synced (give it a couple of
hours), the page:

http://workshop.bea.com/xmlbeans/schemaandperf.jsp

should now link to perftest source code at:

http://workshop.bea.com/xmlbeans/xmlbeans-perftest.zip

Answers, as far as I know, are below.  Please do correct us if
we should be testing Xerces (or any of the other parsers) in a
better way.

> were you using
> our "deferred DOM"--a DOM implementation that tries to refrain from
> "fluffing up" objects until they're needed?  There's evidence 

We measured a straightforward DOM load with no deferring, as far
as I know.
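For reference, deferred node expansion is controlled in Xerces through the feature `http://apache.org/xml/features/dom/defer-node-expansion`. A minimal sketch of explicitly turning it off through JAXP (the class name `DeferredDomSketch` and the tiny inline document are just for illustration; the try/catch hedges against running on a non-Xerces parser that doesn't recognize the feature):

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;
import org.w3c.dom.Document;

public class DeferredDomSketch {
    public static void main(String[] args) throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        try {
            // Xerces-specific feature: true = "deferred DOM" (nodes fluffed
            // up lazily), false = build all DOM nodes eagerly during the parse.
            dbf.setFeature(
                "http://apache.org/xml/features/dom/defer-node-expansion",
                false);
        } catch (ParserConfigurationException e) {
            // Not a Xerces-based parser; the feature is unavailable.
        }
        DocumentBuilder db = dbf.newDocumentBuilder();
        Document doc = db.parse(new ByteArrayInputStream(
            "<root><a/></root>".getBytes("UTF-8")));
        System.out.println(doc.getDocumentElement().getTagName());
    }
}
```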

> Also, is parser start-up time factored in?  We believe that 
> the most common
> use-cases where performance is critical will involve reuse of parser
> objects; therefore, we always use at least 100 "warm-up" 

Yes. We let each parser warm up for about 5 seconds before we
start measuring it.

> Finally, were Xerces grammar-caching capabilities used for 
> the validation
> tests?  Naturally, reading a schema is a pretty slow process; 

I believe so.  You can check our work and let us know whether
what we are doing with XMLGrammarPool (in Test6Xerces.java) is
correct.
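XMLGrammarPool is a Xerces-native API (`org.apache.xerces.xni.grammars.XMLGrammarPool`, installed via the `http://apache.org/xml/properties/internal/grammar-pool` property), so a runnable sketch needs Xerces on the classpath. The same caching idea can be shown with the standard JAXP validation API instead: compile the schema once (the slow step) and reuse the compiled `Schema` for many validations. The class name and inline schema below are illustrative only:

```java
import java.io.StringReader;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;

public class SchemaCacheSketch {
    public static void main(String[] args) throws Exception {
        String xsd =
            "<xs:schema xmlns:xs='http://www.w3.org/2001/XMLSchema'>" +
            "  <xs:element name='doc' type='xs:string'/>" +
            "</xs:schema>";

        // Compile the schema once -- this is the expensive step that
        // grammar caching is meant to amortize.
        Schema schema = SchemaFactory
            .newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI)
            .newSchema(new StreamSource(new StringReader(xsd)));

        // Reuse the compiled grammar across many validations.
        for (int i = 0; i < 3; i++) {
            Validator v = schema.newValidator();
            v.validate(new StreamSource(new StringReader("<doc>hello</doc>")));
        }
        System.out.println("validated 3 documents against one compiled schema");
    }
}
```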

David Bau
