Hello,

I just ran a mass test with a few hundred thousand files (each with 15
properties) in a very structured repository tree (no more than 2000 direct child nodes per node).

Everything works fine so far, but I see a very slight increase in used
memory.
Every document is stored with a final "session.save()", so there are no transient
changes left afterwards. Every 100 documents, I take a little "break" and call
System.gc().

I can see that the used memory holds at a certain value (a few KB up and down)
for quite a long time (about 30-60 min.), then it increases by a few MB, holds
for the next 30-60 min., and so on.
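For reference, here is a minimal, self-contained sketch of how I take these measurements: force a GC before and after a batch of allocations and log the used-heap delta. The batch size and the 1 MB-per-"document" allocation are placeholders, not my actual import code.

```java
// Minimal sketch: measure used heap around a batch of work, with a
// forced GC before each measurement (the "little break" mentioned above).
public class HeapLogger {

    // Used heap = total heap currently claimed by the JVM minus free space.
    static long usedHeap() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        System.gc();
        long before = usedHeap();

        // Placeholder for one batch of 100 "documents" (~1 MB each).
        byte[][] batch = new byte[100][];
        for (int i = 0; i < batch.length; i++) {
            batch[i] = new byte[1024 * 1024];
        }

        batch = null;  // drop references, as after session.save()
        System.gc();   // give the collector a chance before measuring
        long after = usedHeap();

        System.out.println("used heap delta (bytes): " + (after - before));
    }
}
```

Note that System.gc() is only a hint to the JVM, so single readings can jitter; the trend over many batches is what matters.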

I did not change any cache settings, so the cache should only use the 16 MB
default (if I read the default values correctly).

Is it perhaps the index that consumes more and more memory? Is there a
setting to limit the maximum memory used? I would like to solve this so that the
system does not crash with an OutOfMemoryError after a few days.

Thanks for any hints/advice,
Ulrich
