Hi,
I do not set a RAM buffer size, so I assume the default of 16MB applies.
My server runs with an 80MB heap; before Lucene starts, about 50MB of it is already used.
In a production environment I ran into this problem with the heap size set to 750MB
and no other activity on the server (nighttime), though since then I have also
diagnosed some problems in my own code. I just reproduced it with 80MB, and I
expect I could reproduce it with a 100MB heap as well; it would just take longer.
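For reference, here is roughly how I open the writer (a sketch; the path and the
surrounding code are placeholders for my actual setup):

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.store.FSDirectory;

    // Sketch of the writer setup. I never call setRAMBufferSizeMB, so the
    // 16MB default (IndexWriter.DEFAULT_RAM_BUFFER_SIZE_MB) should apply.
    IndexWriter writer = new IndexWriter(
        FSDirectory.getDirectory("/path/to/index"),  // placeholder path
        new StandardAnalyzer(),
        true,                                        // create a new index
        IndexWriter.MaxFieldLength.UNLIMITED);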
Here is the stack trace; I am keeping the heap dump in case it is needed:
java.lang.OutOfMemoryError: Java heap space
Dumping heap to c:\auto_heap_intern.prof ...
Heap dump file created [97173841 bytes in 3.534 secs]
ERROR lucene.SearchManager - Failure in index daemon:
java.lang.OutOfMemoryError: Java heap space
    at java.util.HashSet.<init>(HashSet.java:86)
    at org.apache.lucene.index.DocumentsWriter.initFlushState(DocumentsWriter.java:540)
    at org.apache.lucene.index.DocumentsWriter.closeDocStore(DocumentsWriter.java:367)
    at org.apache.lucene.index.IndexWriter.flushDocStores(IndexWriter.java:1703)
    at org.apache.lucene.index.IndexWriter.doFlush(IndexWriter.java:3534)
    at org.apache.lucene.index.IndexWriter.flush(IndexWriter.java:3450)
    at org.apache.lucene.index.IndexWriter.closeInternal(IndexWriter.java:1638)
    at org.apache.lucene.index.IndexWriter.close(IndexWriter.java:1602)
    at org.apache.lucene.index.IndexWriter.close(IndexWriter.java:1578)
The heap histogram shows:
class org.apache.lucene.index.FreqProxTermsWriter$PostingList: 116736 instances, 3268608 bytes
Well, is there something I should be doing differently?
Stefan
-----Original Message-----
From: Michael McCandless [mailto:[email protected]]
Sent: Wed 24.06.2009 10:48
To: [email protected]
Subject: Re: OutOfMemoryError using IndexWriter
How large is the RAM buffer that you're giving IndexWriter? How large
a heap size do you give the JVM?
Can you post one of the OOM exceptions you're hitting?
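To double-check both numbers at runtime, you could print them, e.g. (a sketch,
assuming writer is your IndexWriter instance):

    // Print the writer's RAM buffer size and the JVM's max heap (sketch).
    System.out.println("RAM buffer MB: " + writer.getRAMBufferSizeMB());
    System.out.println("max heap MB: "
        + Runtime.getRuntime().maxMemory() / (1024.0 * 1024.0));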
Mike
On Wed, Jun 24, 2009 at 4:08 AM, stefan<[email protected]> wrote:
> Hi,
>
> I am using Lucene 2.4.1 to index a database with less than a million records.
> The resulting index is about 50MB in size.
> I keep getting an OutOfMemoryError if I re-use the same IndexWriter to index
> the complete database, even though re-using the writer is recommended in the
> performance hints.
> What I do now is close the index every 10000 objects (and optimize it on every
> 50th close) and create a new IndexWriter to continue, roughly as sketched below.
> This works fine, but it hardly seems like the recommended way to go.
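> Roughly like this (a sketch; toDocument, objects, dir and analyzer stand in
> for my real code):
>
>     int closeCount = 0;
>     IndexWriter writer = new IndexWriter(dir, analyzer, true,
>         IndexWriter.MaxFieldLength.UNLIMITED);
>     for (int i = 0; i < objects.size(); i++) {
>         writer.addDocument(toDocument(objects.get(i)));
>         if ((i + 1) % 10000 == 0) {           // every 10000 objects...
>             closeCount++;
>             if (closeCount % 50 == 0) {
>                 writer.optimize();            // ...optimize every 50th close
>             }
>             writer.close();                   // close to sidestep the OOM
>             writer = new IndexWriter(dir, analyzer, false,  // re-open, append
>                 IndexWriter.MaxFieldLength.UNLIMITED);
>         }
>     }
>     writer.close();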
> I've been using jhat/jmap as well as the NetBeans profiler and am fairly sure
> that this is a problem related to Lucene.
>
> Any ideas, or should I post this to Jira? Jira has quite a few OutOfMemory
> issues, but they all appear to have been closed as of version 2.4.1.
>
> Thanks,
>
> Stefan