Memory is used for each faceted field, and all of the facet caches are loaded
for any facet query. Your best bet is to limit the number of facets.
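The advice above (and the `facet.method=enum` / `facet.limit` workaround mentioned further down the thread) can be sketched as a request like the following. This is a minimal illustration, assuming a local Solr instance with a hypothetical core `collection1` and a hypothetical field `category`; adjust host, port, core, and field names to your deployment.

```python
from urllib.parse import urlencode

# Hypothetical base URL for a local Solr core; not from the original thread.
base = "http://localhost:8983/solr/collection1/select"

params = {
    "q": "*:*",
    "rows": 0,                 # facets only, no stored documents
    "facet": "true",
    "facet.field": "category", # facet on as few fields as possible
    "facet.limit": 100,        # cap the number of facet values returned
    # enum walks the terms and uses filterCache entries rather than
    # building a large per-field cache in the heap, which can trade
    # some speed for a smaller memory footprint.
    "facet.method": "enum",
}

url = base + "?" + urlencode(params)
print(url)
```

Whether `enum` actually uses less memory than the default depends on field cardinality and filterCache sizing, which is consistent with olivier's observation that it only helps "under certain configurations".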

On Tue, Jun 29, 2010 at 11:42 AM, olivier sallou
<olivier.sal...@gmail.com> wrote:
> I have given 6G to Tomcat. Using facet.method=enum and facet.limit seems to
> fix the issue in a few tests, but I know that it is not a "final"
> solution; it will only work under certain configurations.
>
> The real "issue" is being able to know how much RAM an index requires...
>
> 2010/6/29 Nagelberg, Kallin <knagelb...@globeandmail.com>
>
>> How much memory have you given the Solr JVM? Many servlet containers have
>> a small amount by default.
>>
>> -Kal
>>
>> -----Original Message-----
>> From: olivier sallou [mailto:olivier.sal...@gmail.com]
>> Sent: Tuesday, June 29, 2010 2:04 PM
>> To: solr-user@lucene.apache.org
>> Subject: Faceted search outofmemory
>>
>> Hi,
>> I am trying to run a faceted search on a very large index (around 200GB
>> with 200M docs).
>> I get an out-of-memory error. With no facets it works fine.
>>
>> There are quite a few questions around this, but I could not find the answer.
>> How can we estimate the memory required when facets are used, so that I can
>> scale my server/index correctly to handle it?
>>
>> Thanks
>>
>> Olivier
>>
>



-- 
Lance Norskog
goks...@gmail.com
