I already configured the JVM to start with 800MB.  Can you tell me what
normally causes the OutOfMemory exception?  Does it happen during the
search because the index file is too big, or because there are too many
hits?
 
I don't know how to split the index.  Right now I have one single
multi-GB index.  How do I split it into smaller index files?
Does Lucene provide any mechanism to do that?
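 
The only way I can think of is to read the stored documents out of the big
index and write them into several smaller indexes, roughly like the sketch
below (the directory names, the number of parts, and the analyzer are just
placeholders, and I realize this only copies stored fields, so re-indexing
from the original source may be the cleaner route).  Is something like this
reasonable?

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.IndexWriter;

public class SplitIndex {
    public static void main(String[] args) throws Exception {
        int parts = 4;                                   // number of smaller indexes to create
        IndexReader reader = IndexReader.open("index");  // the existing big index

        IndexWriter[] writers = new IndexWriter[parts];
        for (int i = 0; i < parts; i++) {
            // create index-part0, index-part1, ... (true = create a new index)
            writers[i] = new IndexWriter("index-part" + i, new StandardAnalyzer(), true);
        }

        // distribute the stored documents round-robin over the smaller indexes;
        // only stored fields survive this copy, unstored fields are lost
        for (int doc = 0; doc < reader.maxDoc(); doc++) {
            if (reader.isDeleted(doc)) continue;
            writers[doc % parts].addDocument(reader.document(doc));
        }

        for (int i = 0; i < parts; i++) {
            writers[i].optimize();
            writers[i].close();
        }
        reader.close();
    }
}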
 
thanks,
 
Jeff
 

        -----Original Message----- 
        From: Stefan Groschupf 
        Sent: Fri 12/16/2005 2:51 AM 
        To: [email protected] 
        Cc: 
        Subject: Re: best strategy to deal with large index file
        
        

        First of all, check the memory setup of your Tomcat; I think by
        default it uses only 64 MB of RAM.
        You need to change this manually (e.g. via CATALINA_OPTS), see the
        Tomcat documentation.
        Second, I suggest splitting the index and running multiple search
        servers.
        
        How to:
        
        http://wiki.media-style.com/display/nutchDocu/setup+multiple+search+sever
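        
        If you keep a single search webapp for now, you could instead point a
        Lucene MultiSearcher at the smaller indexes.  A rough sketch, where the
        index directory names and the field names are only examples:
        
        import org.apache.lucene.index.Term;
        import org.apache.lucene.search.Hits;
        import org.apache.lucene.search.IndexSearcher;
        import org.apache.lucene.search.MultiSearcher;
        import org.apache.lucene.search.Query;
        import org.apache.lucene.search.Searchable;
        import org.apache.lucene.search.TermQuery;
        
        public class MultiIndexSearch {
            public static void main(String[] args) throws Exception {
                // one IndexSearcher per smaller index (example directory names)
                Searchable[] parts = new Searchable[] {
                    new IndexSearcher("index-part0"),
                    new IndexSearcher("index-part1"),
                    new IndexSearcher("index-part2"),
                    new IndexSearcher("index-part3"),
                };
        
                // MultiSearcher merges hits from all parts into one result set
                MultiSearcher searcher = new MultiSearcher(parts);
        
                Query query = new TermQuery(new Term("content", "nutch"));  // example query
                Hits hits = searcher.search(query);
                for (int i = 0; i < Math.min(10, hits.length()); i++) {
                    System.out.println(hits.score(i) + "  " + hits.doc(i).get("url"));
                }
                searcher.close();
            }
        }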
        
        
        On 16.12.2005, at 09:41, Jeff Liang wrote:
        
        > Hi all,
        >
        > My index file is huge because of a large set of data.  When I do a
        > search, I get an OutOfMemory exception all the time.  It's also bad
        > for backup because I can't do an incremental backup after adding new
        > documents.
        >
        > What's the best strategy to deal with a large index file?  Is there
        > a Lucene built-in method to split the index file?
        >
        > thanks,
        >
        > Jeff
        >
        
        
