Thanks for your help, Mark. Lemme explore a little more and see if someone else can help me out too. :)
> Date: Tue, 22 Jul 2008 16:53:47 -0400
> From: [EMAIL PROTECTED]
> To: solr-user@lucene.apache.org
> Subject: Re: Out of memory on Solr sorting
>
> Someone else is going to have to take over, Sundar - I am new to Solr
> myself. I will say this, though: 25 million docs is pushing the limits of
> a single machine - especially with only 2 GB of RAM, and especially with
> any sort fields. You are at the edge, I believe.
>
> But perhaps you can get by. Have you checked out all the Solr stats on
> the admin page? Maybe you are trying to load too many searchers at a
> time. I think there is a setting to limit the number of searchers that
> can be on deck...
>
> sundar shankar wrote:
>> Hi Mark,
>> I am still getting an OOM even after increasing the heap to 1024 MB.
>> The doc set I have is:
>>
>>     numDocs: 1138976
>>     maxDoc:  1180554
>>
>> Not sure how much more I would need. Is there any other way out of this?
>> I noticed another interesting behavior: I have a Solr setup on a
>> personal box where I try out a lot of different configurations before I
>> even roll the changes out to dev. That server has been running with
>> similar indexed data for a lot longer than the dev box, and it seems to
>> have fetched the results out properly.
>>
>> That box is a two-core Windows machine with just about a gig of memory,
>> and the whole 1024 MB has been allocated to the heap. The dev box is
>> Linux with over 2 GB of memory and 1024 MB allocated to the heap now. :S
>>
>> -Sundar
>>
>>> Date: Tue, 22 Jul 2008 13:17:40 -0700
>>> From: [EMAIL PROTECTED]
>>> To: solr-user@lucene.apache.org
>>> Subject: Re: Out of memory on Solr sorting
>>>
>>> Mark,
>>> Question: how much memory do I need for 25,000,000 docs if I do a sort
>>> by a <string> field, 256 bytes each? 6.4 GB?
>>>
>>> Quoting Mark Miller <[EMAIL PROTECTED]>:
>>>> Because to sort efficiently, Solr loads the term to sort on for each
>>>> doc in the index into an array.
>>>> For ints, longs, etc., it is just an array the size of the number of
>>>> docs in your index (deleted or not, I believe). For a String, it is an
>>>> array holding each unique string, plus an array of ints indexing into
>>>> the String array.
>>>>
>>>> So if you do a sort, and search for something that only gets 1 doc as
>>>> a hit... you're still loading up that field cache for every single doc
>>>> in your index on the first search. With Solr, this happens in the
>>>> background as it warms up the searcher. The end story is, you most
>>>> likely need more RAM to accommodate the sort... have you upped your
>>>> -Xmx setting? I think you can roughly say a 2 million doc index would
>>>> need 40-50 MB (rough, and it depends, but to give an idea) per field
>>>> you are sorting on.
>>>>
>>>> - Mark
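The "setting to limit the number of searchers on deck" that Mark mentions is, I believe, `maxWarmingSearchers` in `solrconfig.xml`. A minimal sketch (the value 2 is just an illustrative choice; check your own solrconfig for the element's placement):

```xml
<!-- Cap how many searchers may warm concurrently in the background.
     Each warming searcher loads its own field caches, so overlapping
     commits can multiply sort-field memory use. -->
<maxWarmingSearchers>2</maxWarmingSearchers>
```

With a low cap, a commit that arrives while another searcher is still warming will fail rather than stack up additional warming searchers, which keeps memory bounded.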
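To put rough numbers on Mark's description, here is a back-of-envelope sketch: one 4-byte "ord" int per document, plus storage for the field's unique values. The byte sizes are assumptions (JVM object and pointer overhead is ignored, so real usage is higher), but it shows why the 6.4 GB guess for 25M docs with 256-byte values is in the right ballpark:

```python
def sort_field_cache_bytes(num_docs, num_unique, avg_value_bytes):
    """Rough FieldCache estimate for sorting on a string field:
    an int ord per doc, plus the unique values themselves."""
    ord_array = num_docs * 4               # one 4-byte int per doc
    values = num_unique * avg_value_bytes  # the unique strings
    return ord_array + values

# Worst case for the 25M-doc question: every value unique, 256 bytes each.
worst = sort_field_cache_bytes(25_000_000, 25_000_000, 256)
print(f"{worst / 2**30:.1f} GiB")  # roughly 6.1 GiB - near the 6.4 GB guess

# If the field has few distinct values, the cost collapses to the ord array.
cheap = sort_field_cache_bytes(25_000_000, 1_000, 256)
print(f"{cheap / 2**20:.1f} MiB")
```

The practical takeaway: cardinality of the sort field matters as much as document count, so sorting on a low-cardinality field is far cheaper than on a unique one.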