We do call document() one to three times in every transformation, but 
only a handful of unique documents get loaded, totaling less than 1 MB 
of XML text.  If we load the same documents over and over, will memory 
consumption keep growing, or will the cached copies be reused?
  If the processor is simply holding those documents in memory and 
reusing them, then caching is probably not the primary problem.
  I will need to do some analysis to figure out what is actually 
causing the memory consumption.  If I can narrow it down, I will put 
together a test case and send it your way.
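For what it's worth, a minimal sketch of the kind of command-line testcase being asked for below: a JAXP harness that runs the same document()-using transformation repeatedly and prints heap usage after each batch of runs. The class name, element names, and run count are placeholders of my own, not anything from this thread, and heap numbers from Runtime are only a rough indicator (GC timing affects them).

```java
import java.io.File;
import java.io.FileWriter;
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class DocumentMemoryTest {

    /** Runs the same document()-based transformation n times; returns the last output. */
    public static String transformNTimes(int n) throws Exception {
        // External document that the stylesheet will pull in via document().
        File lookup = File.createTempFile("lookup", ".xml");
        lookup.deleteOnExit();
        try (FileWriter w = new FileWriter(lookup)) {
            w.write("<items><item>a</item><item>b</item></items>");
        }

        // The pattern discussed in the thread: an <xsl:for-each> whose
        // select is a document()-based expression. An absolute file URI is
        // used so no stylesheet base URI is needed for resolution.
        String xsl =
            "<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
          + "<xsl:output method='text'/>"
          + "<xsl:template match='/'>"
          + "<xsl:for-each select=\"document('" + lookup.toURI() + "')/items/item\">"
          + "<xsl:value-of select='.'/>"
          + "</xsl:for-each>"
          + "</xsl:template>"
          + "</xsl:stylesheet>";

        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(xsl)));
        Runtime rt = Runtime.getRuntime();
        String out = "";
        for (int i = 1; i <= n; i++) {
            StringWriter sw = new StringWriter();
            t.transform(new StreamSource(new StringReader("<doc/>")),
                        new StreamResult(sw));
            out = sw.toString();
            // If each run re-parses and retains another copy of the lookup
            // document, used heap climbs steadily instead of leveling off.
            if (i % Math.max(1, n / 4) == 0) {
                long usedKb = (rt.totalMemory() - rt.freeMemory()) / 1024;
                System.out.println("run " + i + ": heapKB=" + usedKb);
            }
        }
        return out;
    }

    public static void main(String[] args) throws Exception {
        System.out.println("last output: " + transformNTimes(100));
    }
}
```

Running this under a memory profiler (or just with a large n and a small -Xmx) should make it obvious whether repeated document() loads of the same file accumulate.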

Chris

[EMAIL PROTECTED] wrote:

>If you're using document() heavily, there are some known issues regarding
>whether/when/how-long a DTM is retained in memory. There's a PI that can be
>used as a workaround in the common case of an <xsl:for-each> with a
>document()-based expression in its select, but a general and automated
>answer is still being developed. (Basically, the pre-DTM default was not to
>cache source documents unless explicitly directed otherwise; the DTM-based
>code is currently defaulting the other way.)
>
>Outside of that, I don't think we're aware of any particular memory
>problems; indeed, some users have said the DTM model appears to be saving
>them a significant amount of space. If you've got an isolatable testcase
>(preferably one that can be run from the command line rather than needing a
>complex surrounding environment), we could run it through a memory usage
>analyser and see where the space is actually going.
>
>