Dear Dave,

Thanks for your reply. Please find my comments below.

> [EMAIL PROTECTED] wrote:
>>
>> I am processing 80 XML messages per second and I am encountering an
>> OutOfMemory problem, and I have two questions.
>>
>> 1. I create and delete a XercesDOMParser for each message. This may not be
>> efficient, but I do not know how the memory management works internally
>> when a parser is reused. Does the memory keep increasing when reusing a
>> parser? How do I release the memory related to the previous message?
> In general, using the DOM is not as efficient as using SAX-based parsing.
> You can certainly re-use a XercesDOMParser instance, but it might not help
> that much.

Unfortunately, SAX is not an option because of compatibility with existing
code. My question was what happens to the memory allocated for parsing the
previous message, and how I can release it if I have to.
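
What I have in mind is reusing a single parser and releasing the memory of
the previously parsed document explicitly. A minimal sketch, assuming the
Xerces-C++ API (handleDocument() is just a hypothetical placeholder for my
existing DOM code):

#include <xercesc/parsers/XercesDOMParser.hpp>
#include <xercesc/framework/MemBufInputSource.hpp>
#include <xercesc/dom/DOMDocument.hpp>

XERCES_CPP_NAMESPACE_USE

void handleDocument(DOMDocument* doc);   // hypothetical: my existing DOM code

void parseOneMessage(XercesDOMParser& parser,
                     const XMLByte* data, unsigned int len)
{
    MemBufInputSource src(data, len, "msg");
    parser.parse(src);

    handleDocument(parser.getDocument());  // document is owned by the parser

    // Release the memory held for this (and any previously parsed) document,
    // while keeping the parser itself alive for the next message.
    parser.resetDocumentPool();
}

Is resetDocumentPool() the right call to give that memory back, or is there
something else I should be using?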

>
>>
>> 2. Every time I create a new XercesDOMParser, it seems to need a chunk of
>> memory; the XMLReader inside the XercesDOMParser requires about 164KB.
>> After processing about 500K messages, the OS generates an OutOfMemory
>> exception while my process is using only about 650MB. Why does this
>> happen? My guess is that there is memory fragmentation and the OS cannot
>> find 164KB of contiguous memory space. How can I avoid this problem?
> Are you sure your application isn't leaking memory somewhere?  Normally,
> memory for a DOM instance is allocated in large blocks, so fragmentation
> should not be a problem.  It might be, however, if your code is leaking.

Maybe, but I would call it a feature ^^.

Even without a memory leak, I think there can still be a problem.
Let's assume that the program's memory footprint keeps growing.
Let me clarify this with a small example.

loop:
   1. get XML message
   2. create a parser
   3. parse
   4. do my job
   5. delete the parser
   6. add a small entry to a history list; let's assume this requires 10B
      of dynamic memory


So we allocate 160KB in step 2 and deallocate it in step 5, but the small
allocation in step 6 may land in the space returned by step 5. Then that
block can no longer be reused as a whole in the next iteration. Through this
kind of fragmentation, the OS will soon be unable to find a big enough
contiguous block for step 2. Again, this is only a guess.
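
In code, the pattern I am describing looks roughly like this (only a small
sketch; getMessage() and the 10-byte history entry are hypothetical
stand-ins for my real code):

#include <xercesc/parsers/XercesDOMParser.hpp>
#include <xercesc/framework/MemBufInputSource.hpp>
#include <xercesc/util/PlatformUtils.hpp>
#include <list>
#include <string>

XERCES_CPP_NAMESPACE_USE

std::string getMessage() { return "<msg/>"; }   // hypothetical message source

int main()
{
    XMLPlatformUtils::Initialize();
    std::list<std::string> history;              // long-lived small allocations

    for (int i = 0; i < 500000; ++i)
    {
        std::string msg = getMessage();                    // step 1

        XercesDOMParser* parser = new XercesDOMParser();   // step 2: large (~160KB) allocation
        MemBufInputSource src(
            reinterpret_cast<const XMLByte*>(msg.data()),
            static_cast<unsigned int>(msg.size()), "msg");
        parser->parse(src);                                // step 3
        // step 4: work with parser->getDocument()
        delete parser;                                     // step 5: large block returned

        history.push_back("0123456789");                   // step 6: a small allocation that
                                                           // may land inside the freed block
    }

    XMLPlatformUtils::Terminate();
    return 0;
}

If that guess is right, creating the parser once outside the loop and
reusing it (as in the sketch above) should avoid the repeated 160KB
allocation altogether.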

HK


>
> Dave
>
>

