Do you think that large tables are being created per parser instance?

What do you do for external parsed entities?  Is there a way I could fake
it out into thinking xsl:include & xsl:import are parsed entities?

-scott




                                                                                
                                   
                    "Ted Leung"                                                 
                                   
                    <[EMAIL PROTECTED]        To:     <[EMAIL PROTECTED]>       
                                
                    ia.com>              cc:     (bcc: Scott Boag/CAM/Lotus)    
                                   
                                         Subject:     Re: xalan crashes with 
docbook                               
                    01/21/00                                                    
                                   
                    06:10 PM                                                    
                                   
                    Please                                                      
                                   
                    respond to                                                  
                                   
                    xerces-dev                                                  
                                   
                                                                                
                                   
                                                                                
                                   




You definitely cannot re-enter the same parser instance while
another thread is executing inside it.  I don't see an easy solution
other than new'ing a separate parser instance.
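
For illustration, a minimal sketch in Java of that "new a separate parser
instance" approach, assuming the Xerces-J SAXParser class; the class and
helper names here are hypothetical, and the handler wiring is elided:

    import java.io.IOException;
    import org.apache.xerces.parsers.SAXParser;
    import org.xml.sax.InputSource;
    import org.xml.sax.SAXException;

    public class NestedParse {
        // Hypothetical helper, called when an xsl:include is hit in
        // the middle of the outer parse.  The key point: never
        // re-enter the outer parser; create a fresh instance for the
        // nested document instead.
        static void parseNested(String systemId)
                throws SAXException, IOException {
            SAXParser nested = new SAXParser(); // one parser per include
            // ... set document/error handlers on 'nested' here ...
            nested.parse(new InputSource(systemId));
        }
    }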

----- Original Message -----
From: "Scott Boag/CAM/Lotus" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Thursday, January 20, 2000 7:00 PM
Subject: Re: xalan crashes with docbook


>
> > unless the document is just
> > pathologically deeply nested
>
> The DocBook stylesheets are pretty pathologically nested.
>
> > (or circular of course :-) Do you try to catch
> > circular references?
>
> Yes.  (Unless there is a bug...)
>
> We need to do more analysis on this to see how many parsers are parsing
> at a time.  I'm really only worried about per-instance memory overhead
> for tables and the like.  For the moment, only the Java version concerns
> me.
>
> -scott
>
>
>
>
>
> [EMAIL PROTECTED] wrote on 01/20/00 02:53 PM:
> To:      [EMAIL PROTECTED]
> cc:      (bcc: Scott Boag/CAM/Lotus)
> Subject: Re: xalan crashes with docbook
> Please respond to xerces-dev
>
> For the C++ version, this should be OK.  You can just create new parsers
> to handle nested parses.  You are only limited by the virtual memory on
> the machine.  Definitely you cannot reuse the parser at that point,
> though, so you will have to create new parsers for each nested include.
> They aren't all that terribly big in the C++ version, really.  They
> aren't trivial, but they aren't big enough to worry about unless the
> document is just pathologically deeply nested (or circular, of course
> :-)  Do you try to catch circular references?  It's done within the
> parser for referenced entities, but you'd have to do it yourself for any
> kinds of references you parse yourself.
>
> ----------------------------------------
> Dean Roddey
> Software Weenie
> IBM Center for Java Technology - Silicon Valley
> [EMAIL PROTECTED]
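
A minimal sketch in Java of the circular-reference check Dean says you'd
have to do yourself; it assumes includes are first resolved to absolute
system IDs, and all names here are hypothetical:

    import java.util.Stack;

    public class IncludeGuard {
        // Hypothetical guard: track the chain of stylesheets
        // currently being parsed, and refuse to recurse into one
        // that is already on it.
        static Stack includeChain = new Stack();

        static void parseInclude(String absoluteSystemId)
                throws Exception {
            if (includeChain.contains(absoluteSystemId))
                throw new Exception("Circular xsl:include/xsl:import: "
                                    + absoluteSystemId);
            includeChain.push(absoluteSystemId);
            try {
                // ... create a new parser and parse the stylesheet ...
            } finally {
                includeChain.pop();
            }
        }
    }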
>
>
>
> "Scott Boag/CAM/Lotus" <[EMAIL PROTECTED]> on 01/20/2000 11:41:42 AM
>
> Please respond to [EMAIL PROTECTED]
>
> To:   Steve Fisher <[EMAIL PROTECTED]>
> cc:   Robert [EMAIL PROTECTED], xalan-dev@xml.apache.org,
>       [EMAIL PROTECTED]
> Subject:  Re: xalan crashes with docbook
>
>
>
>
> Yes, Rob has reproduced it.  My theory right now is that the recursive
> includes in docbook are causing multiple parsers to be created, since
> the parser is not reentrant, and the parsers are likely to have some big
> tables.  Rob said that he could run docbook files by boosting the memory
> in the VM.
>
> We have this posted as an SPR (Software Problem Report), but it will take
> some time to resolve.
>
> Xerces folks: the problem is that I'm handling xsl:include while I'm in
> the middle of a SAX parse, and I don't think the currently executing
> parser is reentrant at that point.  Do you have any ideas on how this
> might be resolved?
>
> The other thing I might be able to try is to push the xsl:includes and
> xsl:imports onto a stack as the processing occurs, and then process
> those stylesheets after the primary parse is completed... i.e. try to
> build the stylesheets sequentially, instead of recursively.  I'll have
> to look into how feasible this is.
>
> -scott
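
A minimal sketch in Java of the sequential approach Scott floats above;
everything here is hypothetical, since it depends on how the stylesheet
builder is actually structured:

    import java.util.Vector;

    public class DeferredIncludes {
        // Hypothetical: instead of recursing into each xsl:include or
        // xsl:import as it is seen, record it, then drain the queue
        // after the primary parse completes, so only one parser is
        // live at a time.
        static Vector pendingIncludes = new Vector();

        static void onInclude(String systemId) {
            pendingIncludes.addElement(systemId); // defer, don't recurse
        }

        static void afterPrimaryParse() {
            while (!pendingIncludes.isEmpty()) {
                String next = (String) pendingIncludes.elementAt(0);
                pendingIncludes.removeElementAt(0);
                // ... parse 'next' with a fresh parser; it may call
                // onInclude() again, which simply grows the queue ...
            }
        }
    }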
>
>
>
>
>
> Steve Fisher <[EMAIL PROTECTED]> wrote on 01/20/00 01:16 PM:
> To:      Scott Boag/CAM/Lotus <[EMAIL PROTECTED]>
> cc:
> Subject: Re: xalan crashes with docbook
>
> On Fri, 14 Jan 2000, Scott Boag/CAM/Lotus wrote:
>
> >
> > OK, I'll take a look at this.  I wasn't aware of any problems with
> > docbook.  The error is occurring during the build of the stylesheet,
> > and looks to be a genuine out-of-memory error, so I'm a bit surprised.
> >
> > -scott
> >
> > Steve Fisher <[EMAIL PROTECTED]> wrote on 01/14/00 02:28 PM:
> > To:      xalan-dev@xml.apache.org
> > cc:      (bcc: Scott Boag/CAM/Lotus)
> > Subject: xalan crashes with docbook
>
> Did you get anywhere with this out-of-memory condition?  Can you
> reproduce the error?
>
> Steve




