I don't think the problem is a too-small heap. I have run it on a
machine with 24 cores and 64GB of memory, using the default JVM
arguments. It used about 10GB of heap space and still hit this
problem, while the total index is less than 3GB on disk.
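(If anyone wants to check the same thing on their side: a class
histogram of the live heap, e.g. "jmap -histo:live <pid>" against the
running process, shows which classes are holding the space.)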
I don't know why Lucene creates so many ConcurrentHashMap$HashEntry
and IdentityWeakReference objects; it seems Lucene wants to cache
something using WeakIdentityMap.
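For anyone not familiar with that class, below is a minimal sketch of
the pattern I mean. It is not Lucene's actual code, just an
illustration (the class name IdentityWeakCache is made up): keys are
wrapped in weak references that compare by object identity, and the
wrappers live in a ConcurrentHashMap, so every cached key costs one
weak reference plus one ConcurrentHashMap$HashEntry, and entries for
already-collected keys linger until the map is reaped:

    import java.lang.ref.WeakReference;
    import java.util.Iterator;
    import java.util.concurrent.ConcurrentHashMap;

    // Rough sketch of a weak-identity-keyed cache, illustrative only.
    final class IdentityWeakCache<K, V> {

      // A weak reference whose equals/hashCode use object identity,
      // so the map can find the wrapper again for the same key object.
      private static final class Ref extends WeakReference<Object> {
        private final int hash;
        Ref(Object referent) {
          super(referent);
          this.hash = System.identityHashCode(referent);
        }
        @Override public int hashCode() { return hash; }
        @Override public boolean equals(Object other) {
          if (this == other) return true;
          if (!(other instanceof Ref)) return false;
          Object mine = get();
          return mine != null && mine == ((Ref) other).get();
        }
      }

      // Every entry here shows up as one ConcurrentHashMap$HashEntry
      // in a heap histogram; every key wrapper as one weak reference.
      private final ConcurrentHashMap<Ref, V> map =
          new ConcurrentHashMap<Ref, V>();

      V get(K key) { return map.get(new Ref(key)); }

      void put(K key, V value) { map.put(new Ref(key), value); }

      // Entries whose keys were garbage collected stay in the map as
      // cleared references until a sweep like this removes them, so
      // the map can look much bigger in a heap dump than the set of
      // live keys actually is.
      void reap() {
        for (Iterator<Ref> it = map.keySet().iterator(); it.hasNext(); ) {
          if (it.next().get() == null) it.remove();
        }
      }
    }

Again, that is just the general pattern; I haven't traced which Lucene
component is doing the caching in this case.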

On Mon, Sep 22, 2014 at 9:17 PM, Shawn Heisey <[email protected]> wrote:
> On 9/22/2014 6:42 AM, Li Li wrote:
>>   I have an index of about 30 million short strings; the index size is
>> about 3GB on disk.
>>  I have given the JVM 5GB of memory with the default settings on Ubuntu
>> 12.04 with Sun JDK 7.
>>  When I use 20 threads, it's OK. But if I run 30 threads, after a
>> while the JVM does nothing but GC.
>
> When your JVM is doing constant GC, your heap isn't big enough to do the
> job it's being asked to do by the software.  Your options are to
> increase your heap size, or change how your program works so that it
> needs less heap.
>
> Right now, if you have the memory available, simply make your heap
> larger.  I'm not really sure how to reduce heap requirements for a
> custom Lucene program, although if any of these notes for Solr (which is
> of course a Lucene program) can be helpful to you, here they are:
>
> http://wiki.apache.org/solr/SolrPerformanceProblems#Java_Heap
>
> Thanks,
> Shawn
