I did close the searcher after each search was done. Maybe I should try the
CachedSearcher someone posted earlier ...  Is there a final version?
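A shared searcher would avoid the problem at the source: each open
IndexSearcher holds a descriptor for every file in the index, so tens of
concurrent searches that each open (and even promptly close) their own
searcher can pile up descriptors faster than the OS releases them. Below is
a minimal sketch of such a cache, assuming a fixed index path and a
read-only index while searching; the class name, path, and get() method are
illustrative, not the exact version posted earlier on the list.

    import java.io.IOException;
    import org.apache.lucene.search.IndexSearcher;

    /**
     * Minimal shared-searcher cache (illustrative sketch).
     * Assumes the index at INDEX_PATH is not written to while
     * searches run, as in the stress test described below.
     */
    public class CachedSearcher {
        // Hypothetical path; substitute your index directory.
        private static final String INDEX_PATH = "/path/to/index";
        private static IndexSearcher searcher;

        // Hand out one shared searcher instead of opening a new one
        // (and a new set of file descriptors) per query.
        public static synchronized IndexSearcher get() throws IOException {
            if (searcher == null) {
                searcher = new IndexSearcher(INDEX_PATH);
            }
            return searcher;
        }
    }

Callers would do CachedSearcher.get().search(query) and never close the
shared instance; if the index is later updated, the cache would need to swap
in a fresh searcher and close the old one once in-flight searches finish.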

Scott Ganyo wrote:

> Are you closing the searcher when each search is done?
>
> If no: waiting for the garbage collector to release the files is not a good idea.
>
> If yes: it could be a timeout at the OS level holding the file handles open for a while.
>
> Either way, the only real option is to avoid thrashing the searchers...
>
> Scott
>
> > -----Original Message-----
> > From: Hang Li [mailto:[EMAIL PROTECTED]]
> > Sent: Tuesday, July 23, 2002 10:10 AM
> > To: Lucene Users List
> > Subject: Re: Too many open files?
> >
> >
> > Thanks for your quick response; I still want to know why we ran out of
> > file descriptors.
> >
> > --Yup.  Cache and reuse your Searcher as much as possible.
> >
> > --Scott
> >
> > > -----Original Message-----
> > > From: Hang Li [mailto:[EMAIL PROTECTED]]
> > > Sent: Tuesday, July 23, 2002 9:59 AM
> > > To: Lucene Users List
> > > Subject: Too many open files?
> > >
> > >
> > > I have seen a lot of postings about this topic. Any final thoughts?
> > >
> > > We did a simple stress test: Lucene would produce this error
> > > between 30 and 80 concurrent searches. The index directory has
> > > 24 files (15 fields), and given
> > >
> > > "
> > > ulimit -n
> > > 32768
> > > ",
> > >
> > > there should be more than enough FDs. Note, we did not do any
> > > writing to the index while we were searching. Any ideas? Thx.

