Paul J. Lucas wrote:
> On May 30, 2008, at 3:05 AM, Michael McCandless wrote:
>> Are you indexing only one document each time you open IndexWriter?
>> Or do you open a single IndexWriter, add all documents for that
>> directory, then close it?
> The latter.
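For concreteness, a minimal sketch of that pattern (one IndexWriter opened for the whole batch, closed once at the end), written against the Lucene 2.x-era API; the class name, paths, and field names here are invented for the example:

    import java.io.File;
    import java.io.FileReader;
    import java.io.IOException;

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.store.Directory;
    import org.apache.lucene.store.FSDirectory;

    public class IndexOneDirectory {
        // One IndexWriter for the whole batch: open it, add one document
        // per file, close it once at the end.
        static void indexDirectory(String indexPath, File sourceDir) throws IOException {
            Directory dir = FSDirectory.getDirectory(indexPath);
            // true = create a new index; pass false to append to an existing one
            IndexWriter writer = new IndexWriter(dir, new StandardAnalyzer(), true);
            try {
                for (File f : sourceDir.listFiles()) {
                    Document doc = new Document();
                    doc.add(new Field("path", f.getPath(),
                                      Field.Store.YES, Field.Index.UN_TOKENIZED));
                    doc.add(new Field("contents", new FileReader(f)));
                    writer.addDocument(doc);
                }
            } finally {
                writer.close(); // the single close() for the batch
                dir.close();
            }
        }
    }

The important part is simply that addDocument() is called many times against a single writer, and close() is called once.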
>> When the exception occurs, do you know how many simultaneous threads
>> are doing searching? I realize you said it's an extremely light load,
>> but if there is possibly a good number of threads, combined with a
>> large mergeFactor, that would explain the exhaustion.
> I don't know, but the answer is probably either 0 or 1. I forgot if I
> mentioned this before, but there is exactly 1 client for my server.
> Most of the time, the number of queries is 0 because the client is
> quiescent. A query only happens when the user (using the client)
> manually initiates a query. (I don't work on the client code, so I'm
> not totally sure, but the client may also initiate several queries at
> once when getting information for all the files in a directory. But
> even then, we're talking only about a handful of threads.)
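As an aside, the mergeFactor mentioned above is a per-writer setting; a sketch of where it lives in the Lucene 2.x API (the surrounding writer setup is only illustrative):

    import java.io.IOException;

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.store.Directory;
    import org.apache.lucene.store.FSDirectory;

    public class WriterSettings {
        // mergeFactor controls how many segments accumulate before a merge;
        // each segment is several files, and every open reader/searcher holds
        // its own handles on top of the writer's, so a large mergeFactor plus
        // many searching threads multiplies the open-file count.
        static IndexWriter openWriter(String indexPath) throws IOException {
            Directory dir = FSDirectory.getDirectory(indexPath);
            IndexWriter writer = new IndexWriter(dir, new StandardAnalyzer(), false);
            writer.setMergeFactor(10); // 10 is the default; values like 100+ leave many more files open
            return writer;
        }
    }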
> The exception always happens when I call close() after unindexing the
> contents of a directory.
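A rough sketch of that failing sequence as described, again against the Lucene 2.x API; deleting by a per-directory term, and the "dir" field name, are guesses at how the unindexing might be implemented:

    import java.io.IOException;

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.index.Term;
    import org.apache.lucene.store.Directory;
    import org.apache.lucene.store.FSDirectory;

    public class UnindexDirectory {
        // Delete ("unindex") every document that was indexed for one
        // directory, then close the writer; close() is where the exception
        // is reported.
        static void unindexDirectory(String indexPath, String removedDir) throws IOException {
            Directory dir = FSDirectory.getDirectory(indexPath);
            IndexWriter writer = new IndexWriter(dir, new StandardAnalyzer(), false);
            try {
                // Assumes each document recorded its parent directory in a
                // "dir" field when it was indexed; that field name is a guess.
                writer.deleteDocuments(new Term("dir", removedDir));
            } finally {
                writer.close(); // the step that fails in the report above
                dir.close();
            }
        }
    }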
>> Do you know what your descriptor limit actually is? You can use this
>> simple JUnit test (from the upcoming Lucene in Action revision) to
>> check:
> 10237.
> - Paul
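For readers without the book, a rough sketch of what such a descriptor-limit check can look like (open file handles until the OS refuses, then report the count); this is an illustration, not the actual Lucene in Action test:

    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;

    import junit.framework.TestCase;

    public class OpenFileLimitCheck extends TestCase {
        // Keep opening file handles until the OS refuses, then report how
        // many were opened before the failure.
        public void testOpenFileLimit() throws IOException {
            List<FileOutputStream> streams = new ArrayList<FileOutputStream>();
            try {
                while (true) {
                    File f = File.createTempFile("fd-limit", ".tmp");
                    f.deleteOnExit();
                    streams.add(new FileOutputStream(f));
                }
            } catch (IOException e) {
                System.out.println("hit the open-file limit after "
                                   + streams.size() + " files: " + e);
            } finally {
                for (FileOutputStream s : streams) {
                    s.close();
                }
            }
        }
    }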
Too many open files is definitely not the issue, then. In my own testing, running out of file descriptors only ever produces a missing segments file, never one of the other files going missing as you are seeing. The only way I can reproduce your exception is by using two IndexWriters on the same index to corrupt it.