Hi!

Thank you for your kind reply. I'm aware of how the GC works; the problem
was that on my system I actually got an OutOfMemoryError when increasing
the iterations to 100,000. This did not happen with the old xml4j. The
problem, however, was my JVM. Thanks for pointing that out!

If anyone is running Linux and Sun's JDK 1.2pre-v2 and is experiencing
memory leaks with Xerces-J, the problem is in the JDK. After upgrading to
JDK 1.2 RC2, everything works like a charm!
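
If you are not sure which JDK a given "java" binary actually is,
"java -version" prints the version string, so you can confirm that the
upgrade took effect:

<CODE>
java -version
</CODE>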

Thank you all for your help.

/Christian Lizell



"George T. Joseph" wrote:
> 
> Christian,
> 
> I ran your sample and found that used memory does increase, but that's not
> necessarily a problem or a leak.  System.gc() is only a hint to the JVM
> that GC could be run.  If the JVM detects that there's still plenty of
> free space on the heap, it may not bother.  In your example, add the
> display of rt.totalMemory() as well as (rt.totalMemory()-rt.freeMemory())
> and you'll probably see that by the time your program ends, only 50% of
> the heap is actually used (depending on the JVM used).  The program ends
> before GC is needed.
> 
> If you want to see the GC in action, change the iterations from 10,000 to
> 100,000 and you should see that used memory rises and falls as the GC is run.
> You may also see that the program runs at a slower rate for the same reason.
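> 
> For reference, the changed loop might look something like this (just an
> untested sketch of the two suggestions above, reusing the names from
> your CloneTest3):
> 
> <CODE>
> // Run 100,000 iterations instead of 10,000 so the heap actually fills
> // and the collector is forced to work.
> for (int i = 0; i <= 100000; i++) {
>     new DocumentImpl().createTextNode("");
>     if ((i % 1000) == 0) {
>         System.gc(); System.runFinalization();
>         Runtime rt = Runtime.getRuntime();
>         // Print the total heap next to the used portion, so you can see
>         // how much free room the JVM still had whenever it skipped a GC.
>         System.out.println("Total: " + rt.totalMemory() + ", used: "
>             + (rt.totalMemory() - rt.freeMemory()));
>     }
> }
> </CODE>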
> 
> What operating system and Java VM are you running?  Each combination of
> the two will probably result in different memory utilization profiles.
> For instance, the HotSpot performance engine used with Sun's JVMs seems
> to use more memory and run GC less, while the Classic version of the same
> JVM seems to use less memory and run GC more often.  Different JVMs may
> also have differently sized initial heaps and different heap size limits.
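> 
> If you want to compare the profiles directly, you can also pin the heap
> size yourself when launching the test.  On Sun's 1.2 JVMs the
> non-standard -Xms/-Xmx options set the initial and maximum heap (other
> JVMs may spell these differently, so check your VM's documentation):
> 
> <CODE>
> java -Xms8m -Xmx16m CloneTest3
> </CODE>
> 
> With a small fixed ceiling like that, the GC has no choice but to run
> well before the program ends, and a real leak would show up as an
> OutOfMemoryError rather than a "used" number that just keeps climbing.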
> 
> george
> 
> -----Original Message-----
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of Christian Lizell
> Sent: Friday, February 11, 2000 4:46 AM
> To: [EMAIL PROTECTED]
> Subject: Caching problem? (was: Re: cloneNode() eats memory?)
> 
> Hi!
> 
> I have tried every way I can think of to reclaim the memory from a
> created node.
> 
> Here is a very simple test case illustrating the problem:
> 
> <CODE>
> import org.w3c.dom.*;
> import org.apache.xerces.dom.*;
> 
> public class CloneTest3 {
>     public static void main(String[] args) {
>         // Make Nodes
>         for (int i=0; i<=10000; i++) {
>             new DocumentImpl().createTextNode("");
>             if ((i%1000) == 0) {
>                 System.gc(); System.runFinalization();
>                 Runtime rt = Runtime.getRuntime();
>                 System.out.println("Memory usage after " + i + " empty text nodes: "
>                     + (rt.totalMemory()-rt.freeMemory()));
>             }
>         }
>     }
> }
> </CODE>
> 
> When executing this, the memory for the created text nodes is never
> reclaimed:
> 
> <CODE>
> Memory usage after 0 clones: 2018096
> Memory usage after 1000 clones: 2267040
> Memory usage after 2000 clones: 2515040
> Memory usage after 3000 clones: 2763040
> Memory usage after 4000 clones: 3011040
> Memory usage after 5000 clones: 3259040
> Memory usage after 6000 clones: 3507040
> Memory usage after 7000 clones: 3755040
> Memory usage after 8000 clones: 4003040
> Memory usage after 9000 clones: 4251040
> Memory usage after 10000 clones: 4499040
> </CODE>
> 
> Isn't this weird?
> How can I reclaim the memory?
> Are the nodes statically cached in some class?
> 
> Any help on this is very much appreciated. I am entering panic mode here. :)
> 
> Thanks,
> /Christian Lizell
