Hi

Thank you for your response

Which version of Solr?
I inherited the project so I'm not exactly sure ... CHANGES.txt says
Apache Solr Version 1.4-dev
$Id: CHANGES.txt 793090 2009-07-10 19:40:33Z yonik $

What garbage collection parameters?
ulimit -n 100000 ; nohup java -server -XX:+UseConcMarkSweepGC
-XX:+CMSIncrementalMode -XX:+UseParNewGC -XX:+CMSPermGenSweepingEnabled
-XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:-TraceClassUnloading
-XX:ParallelGCThreads=4 -Xmx5000m
-Dsolr.solr.home=/opt/solr_env/index
-Djava.util.logging.config.file=/opt/solr_env/index/logging.properties
-Djetty.host=0.0.0.0 -DSTOP.PORT=8079 -DSTOP.KEY=stop.now
-Dcom.sun.management.jmxremote=true -Dcom.sun.management.jmxremote.port=3500
-Dcom.sun.management.jmxremote.ssl=false -jar start.jar > solr.log &
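Since the launch line above already enables remote JMX (-Dcom.sun.management.jmxremote.port=3500), the same heap and collector counters that jconsole reads can also be sampled in-process through java.lang.management. A minimal sketch (the HeapStats class name is mine, not part of the Solr setup):

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class HeapStats {
    public static void main(String[] args) {
        // Heap usage as reported by the JVM's own management interface
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = memory.getHeapMemoryUsage();
        System.out.println("heap used/max (MB): "
                + heap.getUsed() / (1024 * 1024) + "/"
                + heap.getMax() / (1024 * 1024));

        // Cumulative collection count and accumulated pause time per collector
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            System.out.println(gc.getName() + ": count=" + gc.getCollectionCount()
                    + " timeMs=" + gc.getCollectionTime());
        }
    }
}
```

Sampling getCollectionTime() periodically (or via jconsole against port 3500) makes it easy to confirm whether full-GC time really keeps growing over the 12-hour window.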

What version of java?
Java HotSpot(TM) 64-Bit Server VM (build 1.5.0_18-b02, mixed mode). I also
tried with 1.6, but that didn't change anything. Changing -Xmx from 5000 to
3500 makes the problem happen more quickly

The machine is an xlarge instance on Amazon EC2:

7 GB of memory
20 EC2 Compute Units (8 virtual cores with 2.5 EC2 Compute Units each)
1690 GB of instance storage
64-bit platform
I/O Performance: High
API name: c1.xlarge


Thank you for your help


matt


On Thu, Jan 21, 2010 at 11:57 PM, Lance Norskog <goks...@gmail.com> wrote:

> Which version of Solr? Java? What garbage collection parameters?
>
> On Thu, Jan 21, 2010 at 1:03 PM, Matthieu Labour <matth...@strateer.com>
> wrote:
> > Hi
> >
> > I have been requested to look at a solr instance that has been patched
> with
> > our own home grown patch to be able to handle 1000 cores on a solr
> instance
> >
> > The solr instance doesn't perform well. Within 12 hours, I can see the
> > garbage collection taking a lot of time and query & update requests are
> > timing out (see below )
> >
> > [Full GC [PSYoungGen: 673152K->98800K(933888K)] [PSOldGen:
> > 2389375K->2389375K(2389376K)] 3062527K->2488176K(3323264K) [PSPermGen:
> > 23681K->23681K(23744K)], 4.0807080 secs] [Times: user=4.08 sys=0.00,
> > real=4.08 secs]
> >
> > org.apache.solr.client.solrj.SolrServerException:
> > java.net.SocketTimeoutException: Read timed out
> >        at
> >
> org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:472)
> >        at
> >
> org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:243)
> >        at
> >
> org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:105)
> >
> >
> > I used YourKit to track down possible memory leaks but didn't succeed in
> > finding one
> >
> > The biggest objects using up the memory seem to be org.apache.lucene.Term
> > and org.apache.lucene.TermInfo
> >
> > The total size of the data directory in index is 46G with a typical big
> core
> > being 100000 documents and size of 103M
> >
> > There are lots of search requests and indexing happening
> >
> > I am posting to the mailing list hoping to hear that we must be doing
> > something completely wrong, because it doesn't seem to me that we are
> > pushing the limit. I would appreciate any tips on where to look etc... to
> > troubleshoot and solve the issue
> >
> > Thank you for your help !
> >
> > matt
> >
>
>
>
> --
> Lance Norskog
> goks...@gmail.com
>
