Once segment files are opened by Nutch, they are not closed.

If you're using Linux, you can raise the per-process file handle limit with
ulimit, e.g. ulimit -n 10000
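
For example, run this in the shell that launches the Nutch search server
(the limit is inherited by child processes):

```shell
# Show the current soft and hard limits on open file descriptors
ulimit -Sn
ulimit -Hn

# Raise the soft limit for this shell session and its children.
# An unprivileged process may only raise the soft limit up to the
# hard limit, so raise it all the way here; a specific value like
# `ulimit -n 10000` works the same way if the hard limit allows it.
ulimit -n "$(ulimit -Hn)"
ulimit -n
```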

Andy

On 12/21/05, Stefan Groschupf <[EMAIL PROTECTED]> wrote:
>
> You will find many postings about this in the lucene user mailing
> list and also in the lucene wiki.
> The way to go is to merge indexes or to run several search servers.
>
> Stefan
>
> Am 21.12.2005 um 14:00 schrieb K.A.Hussain Ali:
>
> >
> > Hi all,
> >
> > When I search the segments generated by Nutch, I get the "Too many
> > open files" error.
> >
> > I searched the mailing list and found that I should merge segments
> > and indexes to minimize the number of segments used for searching.
> >
> > But my doubt is: doesn't Nutch close those files after every search?
> > Is there any other solution to this error? Even if we merge segments
> > or raise the OS file handle limit, there still seems to be a limit
> > to the number of files that can be open.
> >
> > Any help is greatly appreciated
> > regards
> > -Hussain
> >
>
>
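
To check whether the handles really do stay open, you can count a
process's file descriptors on Linux via /proc. The snippet below
inspects the current shell's own PID ($$) as a stand-in; in practice
you would substitute the PID of the search server JVM:

```shell
# Count the file descriptors a process currently holds open.
# /proc/<pid>/fd contains one symlink per open descriptor (Linux only).
PID=$$   # stand-in PID; use the Nutch search JVM's PID instead
ls /proc/"$PID"/fd | wc -l
```

Watching this number grow across searches would confirm that segment
files are being opened and never closed.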
