On Wed, Mar 31, 2010 at 3:23 AM, Cech. Ulrich <[email protected]> wrote:
> Hello,
>
> I'm just running a mass test with some hundreds of thousands of files (all have 15 
> properties) in a very structured repo tree (no more than 2000 direct child 
> nodes!).
>
> Everything works fine so far, but I see a very slight increase in used 
> memory.
> Every document storage ends with a "session.save()", so there are no 
> transient changes left over. Every 100 documents, I take a little "break" and 
> call System.gc().
>
> I see that the used memory holds at a certain value (a few kB more or 
> less) for quite a long time (about 30-60 min.), then there is an increase of 
> a few MB, which holds for the next 30-60 min., and so on.
>
> I did not change any cache settings, so the cache should only be the 16 MB 
> default (if I read the values correctly).
>
> Is it perhaps the index that consumes more and more memory? Is there a 
> setting to limit the maximum memory used? I would like to solve this so that the 
> system does not crash with an OutOfMemoryError after a few days.
>

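For reference, the ingest pattern described above (a save per document, plus a short break and a GC hint every 100 documents) looks roughly like the sketch below. The `store()` method is a hypothetical stand-in for the real node creation plus `session.save()`; only the loop structure is taken from the mail:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchIngest {
    static int stored = 0;

    // Placeholder for the real work: create the node, set the 15
    // properties, then call session.save() so nothing stays transient.
    static void store(String doc) {
        stored++;
    }

    public static void main(String[] args) throws InterruptedException {
        List<String> docs = new ArrayList<>();
        for (int i = 0; i < 1000; i++) docs.add("doc" + i);

        for (String doc : docs) {
            store(doc);              // one session.save() per document
            if (stored % 100 == 0) { // every 100 documents...
                System.gc();         // ...hint a collection
                Thread.sleep(10);    // ...and take a little break
            }
        }
        System.out.println("stored " + stored); // prints "stored 1000"
    }
}
```

Note that `System.gc()` is only a hint to the JVM; it does not guarantee a collection, so it cannot by itself bound the heap.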
Are you talking about JVM memory usage, or overall memory consumption on
the machine?  I ask because some recent tests of ours showed that the
network connections being opened were not going away after they were
done being used.  But I think this went away in recent versions.  Of course,
this is specific to my SPI/Davex networked setup.
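The distinction matters: OS-level process size includes native allocations and socket buffers, while the JVM heap is what an OutOfMemoryError is about. A quick way to watch the heap itself, using only the standard `java.lang.management` API (this is plain JDK code, not anything Jackrabbit-specific), is:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class HeapProbe {
    public static void main(String[] args) {
        // Reports the JVM-managed heap, as opposed to the OS-level
        // footprint of the whole process (which also covers native
        // memory, e.g. sockets held by a networked SPI setup).
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = mem.getHeapMemoryUsage();
        System.out.printf("heap used: %d MB, committed: %d MB, max: %d MB%n",
                heap.getUsed() >> 20, heap.getCommitted() >> 20,
                heap.getMax() >> 20);
    }
}
```

Logging this every few minutes during the mass test would show whether the periodic multi-MB steps happen inside the heap (pointing at caches or the index) or only in the process size (pointing at native resources such as leaked connections).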
