Thanks for the response. About setting the buffer size, this looks like
it could be what I am looking for. A few questions:
1. Do I have to set the buffer size on each transformer and the
serializer as well as the generator? What about setting it on the pipeline?
2. Does the full amount of the buffer get allocated automatically for
each request, or does it grow gradually based on the XML stream size?
I have a lot of steps in the pipeline, so I am worried about the impact
of creating too many buffers even if they are relatively small. A 1 Meg
buffer might be too much if it is created for every element of every
pipeline for every request.
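To illustrate the difference I'm asking about: a gradually growing buffer (the pattern ByteArrayOutputStream uses) only ever allocates roughly what the stream actually needs, so a 1 Meg *maximum* would not cost 1 Meg per instance. A minimal sketch of that pattern, with made-up names (I don't know which strategy Cocoon actually uses):

```java
import java.util.Arrays;

// Hypothetical sketch of a lazily growing buffer: it starts small and
// doubles on demand, instead of allocating the maximum up front.
public class GrowingBuffer {
    private byte[] data = new byte[512]; // small initial allocation
    private int size = 0;

    public void write(byte[] chunk) {
        if (size + chunk.length > data.length) {
            // Grow geometrically, but only as far as actually needed.
            int newCapacity = Math.max(data.length * 2, size + chunk.length);
            data = Arrays.copyOf(data, newCapacity);
        }
        System.arraycopy(chunk, 0, data, size, chunk.length);
        size += chunk.length;
    }

    public int capacity() { return data.length; }
    public int size() { return size; }
}
```

If the buffers grow like this, many small pipeline steps would be cheap until a request actually streams a lot of data; if the full size is allocated eagerly, the per-step cost is fixed.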
On an unrelated note, is there some way to configure caching so that
nothing is cached that is larger than a certain size? I'm worried that
this might be a caching issue rather than a buffer issue.
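The policy I have in mind is something like the following sketch. This is not an existing Cocoon API as far as I know, just an illustration of "don't cache anything above a threshold":

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical size-capped cache: entries above maxEntryBytes are
// simply not stored. Names and API are made up for illustration.
public class SizeCappedCache {
    private final Map<String, byte[]> map = new ConcurrentHashMap<>();
    private final int maxEntryBytes;

    public SizeCappedCache(int maxEntryBytes) {
        this.maxEntryBytes = maxEntryBytes;
    }

    // Returns true if the value was cached, false if it was too large.
    public boolean put(String key, byte[] value) {
        if (value.length > maxEntryBytes) {
            return false; // skip caching oversized responses
        }
        map.put(key, value);
        return true;
    }

    public byte[] get(String key) {
        return map.get(key);
    }
}
```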
How do you read the object graph from the heap dump? To tell you the
truth, I'm not sure. This is the hierarchy generated by the Heap
Analyzer tool from IBM, and is from a heap dump on an AIX box running
the IBM JRE. My guess as to the Object referencing the
ComponentsSelector is that the ArrayList is not generified, so the
analyzer doesn't know the actual type of the Object being referenced.
What the object actually is would depend on what
CachingProcessorPipeline put into the ArrayList. That is just a guess,
though. And I have no explanation for the link between
FOM_Cocoon$CallContext and ConcreteCallProcessor. Perhaps things were
different in the 2.1.9 release?
Regarding when this problem occurs, it only happens after the system has
been under heavy load for many days.
Joerg Heinicke wrote:
On 23.04.2008 20:56, Bruce Atherton wrote:
Here are some specifics, in case they are relevant. One heap analysis
showed 1.5 Gigabytes of memory being held by the object at the end of
this tree (package names suppressed to keep this readable):
- ScriptableObject
- FOM_Cocoon
- FOM_Cocoon$CallContext
- ConcreteTreeProcessor
- InvokeContext
- CachingProcessorPipeline
- ArrayList
- Object
- ComponentsSelector
- ComponentsSelector
- Collections$SynchronizedMap
- HashMap
- Array of HashMap$Entry
- HashMap$Entry (size including children = 1.5 Gig)
How do I read this? I tried to reproduce it in the code but it does
not make sense to me. FOM_Cocoon$CallContext has no reference to the
ConcreteTreeProcessor or the other way around. Object can have a reference
to neither ArrayList nor ComponentsSelector. So what does it mean?
That HashMap$Entry object had the following child tree, each one a
bit smaller than its parent. Only the largest child shown at each level:
- HashMap$Entry (a different one)
- TraxTransformer
- TransformerHandlerImpl
- SAXResult
- JxTemplateTransformer
- JxTemplateGenerator$TransformerAdapter$TemplateConsumer
After this there are two JxTemplateGenerators that split the size.
The bigger one goes to FormsTemplateTransformer, CIncludeTransformer,
and on and on into a long loop of JxTemplateGenerator$StartElement
and JxTemplateGenerator$EndElement.
Do you run into the OOME from just one request, or only after a certain
number of requests, since the content is stored "somewhere"?
It seems to me that it should be possible to put a limiter on the
transformers that says that if more than a certain volume of data is
put through, an exception is thrown. I know that part of the page may
have been rendered before the error page shows up, but I am fine with
that. I'd be happy to limit each part of the pipeline to outputting
no more than 10Meg.
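Since Cocoon pipelines are SAX-based, such a limiter could conceivably be a SAX filter that counts character data and aborts past a threshold. A rough sketch of the idea, not existing Cocoon code (the class and its limit are made up):

```java
import org.xml.sax.SAXException;
import org.xml.sax.helpers.XMLFilterImpl;

// Hypothetical SAX filter that throws once more than maxChars of
// character data has passed through -- the "limiter" idea sketched,
// not an existing Cocoon component.
public class VolumeLimitFilter extends XMLFilterImpl {
    private final long maxChars;
    private long seen = 0;

    public VolumeLimitFilter(long maxChars) {
        this.maxChars = maxChars;
    }

    @Override
    public void characters(char[] ch, int start, int length) throws SAXException {
        seen += length;
        if (seen > maxChars) {
            throw new SAXException(
                "Pipeline step exceeded " + maxChars + " characters");
        }
        super.characters(ch, start, length);
    }
}
```

Wired between two pipeline steps, the exception would abort the request once the volume limit is crossed, which is exactly the "partial page then error" behavior I said I could live with.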
Does it mean you want to flush the OutputStream after a certain amount
of data? What about [1] and [2]?
Joerg
[1] https://issues.apache.org/jira/browse/COCOON-2168
[2] http://marc.info/?l=xml-cocoon-dev&m=120477640924395&w=4
---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]