On Thu, 04 Mar 2004 15:32:53 -0700 Dmitry Serebrennikov <[EMAIL PROTECTED]> wrote:
> karl wettin wrote:
>
> > I'm getting this exception, and I can't explain it. It only occurs
> > when calling my method that retrieves the content from my index. I
> > get the same exception in 1.2 as in 1.3-final; I have been searching
> > the web all over the place, but can't find anything other than the
> > same problem being described, with no solution.
> >
> > It takes some 50-100 calls to the method until the exception is
> > thrown. Don't I close the IndexReader correctly? Is there anything
> > else I need to close?
> >
> > Grateful for any hints.
>
> You can tell how many file handles you are allowed to open by your OS
> (it looks like some flavor of Unix from the paths that you have
> included). One way to reduce the number of files Lucene opens is to
> use compound indexes (where each index segment uses a single file).
> Look for this flag on the IndexWriter object. You will have to
> optimize existing indexes to convert them.

My file limit was set to "unlimited" when I got the problem, so I
started using .setUseCompoundFile(true) in order to resolve it.

Even though I only add new data (no deletes) to the index, I started
getting the exception "Lock obtain timed out" after the change above.
Passing "-DdisableLuceneLocks=true" to java "solved" this problem, but
it does not feel right. I only access Lucene from one JVM, and I make
sure I never write and delete at the same time. Do I have anything to
worry about?

-- 
karl
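Below is a minimal sketch, against the 1.2/1.3-era API, of the compound-file
conversion Dmitry describes plus reader cleanup so that a failed retrieval
cannot leak file handles. The index path, the class name, and the
reader.document(0) lookup are placeholders, not taken from karl's code.

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.IndexWriter;

public class CompoundConvert {
    public static void main(String[] args) throws Exception {
        String indexDir = "/path/to/index";   // placeholder path

        // Open the existing index (create = false), switch new segments to
        // the compound format, and optimize so the existing segments are
        // rewritten in that format as well.
        IndexWriter writer = new IndexWriter(indexDir, new StandardAnalyzer(), false);
        writer.setUseCompoundFile(true);
        writer.optimize();
        writer.close();   // releases the write lock

        // When retrieving content, close the reader in a finally block so an
        // exception cannot leave file handles open.
        IndexReader reader = IndexReader.open(indexDir);
        try {
            Document doc = reader.document(0);   // placeholder lookup
            System.out.println(doc);
        } finally {
            reader.close();
        }
    }
}

If "Lock obtain timed out" still appears with only one JVM, a common cause in
the 1.x releases is a stale lock file left behind by a writer (or a deleting
reader) that was never closed, so it is usually worth ruling that out before
resorting to -DdisableLuceneLocks=true.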