On Thu, Mar 17, 2011 at 3:55 PM, Geeta Subramanian
<gsubraman...@commvault.com> wrote:
> Hi Yonik,
>
> I am not setting the ramBufferSizeMB or maxBufferedDocs params...
> DO I need to for Indexing?

No, the default settings that come with Solr should be fine.
You should verify, however, that they have not been changed.

An older solrconfig that used maxBufferedDocs could cause an OOM with
large documents, since it buffered a certain number of documents
instead of a certain amount of RAM.

Perhaps post your solrconfig (or at least the sections related to
index configuration).
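For reference, the relevant part of a stock solrconfig.xml from that era looks roughly like this (the values shown are the shipped defaults; treat it as a sketch to compare against, not as your exact file):

```xml
<!-- Sketch of the index section from an example solrconfig.xml.
     Buffering by RAM (ramBufferSizeMB) is safe with large documents;
     buffering by document count (maxBufferedDocs) is not, which is why
     it is left commented out here. -->
<indexDefaults>
  <ramBufferSizeMB>32</ramBufferSizeMB>
  <!-- <maxBufferedDocs>1000</maxBufferedDocs> -->
</indexDefaults>
```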

-Yonik
http://lucidimagination.com


> Regards,
> Geeta
>
> -----Original Message-----
> From: ysee...@gmail.com [mailto:ysee...@gmail.com] On Behalf Of Yonik Seeley
> Sent: 17 March, 2011 3:45 PM
> To: Geeta Subramanian
> Cc: solr-user@lucene.apache.org
> Subject: Re: memory not getting released in tomcat after pushing large 
> documents
>
> In your solrconfig.xml,
> Are you specifying ramBufferSizeMB or maxBufferedDocs?
>
> -Yonik
> http://lucidimagination.com
>
>
> On Thu, Mar 17, 2011 at 12:27 PM, Geeta Subramanian 
> <gsubraman...@commvault.com> wrote:
>> Hi,
>>
>>  Thanks for the reply.
>> I am sorry, the logs I posted earlier do come from a custom update handler.
>>
>> But I have a local setup that does not have a custom update handler; it is
>> exactly as downloaded from the Solr site, and even that gives me a heap
>> space error.
>>
>> at java.util.Arrays.copyOf(Unknown Source)
>>        at java.lang.AbstractStringBuilder.expandCapacity(Unknown Source)
>>        at java.lang.AbstractStringBuilder.append(Unknown Source)
>>        at java.lang.StringBuilder.append(Unknown Source)
>>        at org.apache.solr.handler.extraction.SolrContentHandler.characters(SolrContentHandler.java:257)
>>        at org.apache.tika.sax.ContentHandlerDecorator.characters(ContentHandlerDecorator.java:124)
>>        at org.apache.tika.sax.SecureContentHandler.characters(SecureContentHandler.java:153)
>>        at org.apache.tika.sax.ContentHandlerDecorator.characters(ContentHandlerDecorator.java:124)
>>        at org.apache.tika.sax.ContentHandlerDecorator.characters(ContentHandlerDecorator.java:124)
>>        at org.apache.tika.sax.SafeContentHandler.access$001(SafeContentHandler.java:39)
>>        at org.apache.tika.sax.SafeContentHandler$1.write(SafeContentHandler.java:61)
>>        at org.apache.tika.sax.SafeContentHandler.filter(SafeContentHandler.java:113)
>>        at org.apache.tika.sax.SafeContentHandler.characters(SafeContentHandler.java:151)
>>        at org.apache.tika.sax.XHTMLContentHandler.characters(XHTMLContentHandler.java:175)
>>        at org.apache.tika.parser.txt.TXTParser.parse(TXTParser.java:144)
>>        at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:142)
>>        at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:99)
>>        at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:112)
>>        at org.apache.solr.handler.extraction.ExtractingDocumentLoader.load(ExtractingDocumentLoader.java:193)
>>        at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:54)
>>        at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131)
>>        at org.apache.solr.core.RequestHandlers$LazyRequestHandlerWrapper.handleRequest(RequestHandlers.java:237)
>>        at org.apache.solr.core.SolrCore.execute(SolrCore.java:1323)
>>        at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:337)
>>
>>
>>
>> Also, in general, if I post 25 x 100 MB documents to Solr, what would
>> the ideal heap size be?
>> Also, I see that when I push a single 100 MB document, Task Manager
>> shows about 900 MB of memory in use, and subsequent pushes keep the
>> memory at about 900 MB; at what point can an OOM crash occur?
>>
>> When I ran the YourKit profiler, I saw that around 1 GB of memory was
>> consumed just by char[] and String[] instances.
>> How can I find out what is creating these (is it Solr or Tika), and how
>> can I free these objects?
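One way to answer that is to take a heap dump of the Tomcat JVM and inspect it offline. The commands below are standard JDK tools, but the pid and file name are placeholders you would substitute yourself:

```shell
# Dump only live (reachable) objects from the running Tomcat JVM.
# <tomcat_pid> is a placeholder; find the actual pid with `jps -l`.
jmap -dump:live,format=b,file=solr-heap.hprof <tomcat_pid>

# Browse the dump in a viewer; Eclipse MAT also reads .hprof files and
# shows which GC roots are holding the large char[]/String[] instances,
# which tells you whether Solr or Tika allocated them.
jhat solr-heap.hprof
```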
>>
>>
>> Thank you so much for your time and help,
>>
>>
>>
>> Regards,
>> Geeta
>>
>>
>>
>> -----Original Message-----
>> From: ysee...@gmail.com [mailto:ysee...@gmail.com] On Behalf Of Yonik
>> Seeley
>> Sent: 17 March, 2011 12:21 PM
>> To: solr-user@lucene.apache.org
>> Cc: Geeta Subramanian
>> Subject: Re: memory not getting released in tomcat after pushing large
>> documents
>>
>> On Thu, Mar 17, 2011 at 12:12 PM, Geeta Subramanian 
>> <gsubraman...@commvault.com> wrote:
>>>        at com.commvault.solr.handler.extraction.CVExtractingDocumentLoader.load(CVExtractingDocumentLoader.java:349)
>>
>> Looks like you're using a custom update handler.  Perhaps that's 
>> accidentally hanging onto memory?
>>
>> -Yonik
>> http://lucidimagination.com
>>
>>
>> ******************Legal Disclaimer***************************
>> "This communication may contain confidential and privileged material
>> for the sole use of the intended recipient.  Any unauthorized review,
>> use or distribution by others is strictly prohibited.  If you have
>> received the message in error, please advise the sender by reply email
>> and delete the message. Thank you."
>> ****************************************************************
>>
>
