Wow, big index - almost 300M documents. It looks like we need the small change Doug suggested applied to the code, so I'll do that now.
Otis

--- Vince Taluskie <[EMAIL PROTECTED]> wrote:
> Doug,
>
> This fix worked like a charm!  Now it's returning correctly:
>
>     Index /rr/all_indexes/SL contains 291440746 documents
>
> Thanks for the great toolkit and excellent support.
>
> Vince
>
> On Wed, 24 Sep 2003, Doug Cutting wrote:
>
> > Vince Taluskie wrote:
> > > Index /rr/tmpindexes/global/SL contains -245430166 documents
> > >
> > > 11:53:36,377 ERROR [Engine] StandardWrapperValve[RRSearcher]:
> > > Servlet.service() for servlet RRSearcher threw exception
> > > java.lang.NegativeArraySizeException
> > >     at org.apache.lucene.index.SegmentReader.norms(Unknown Source)
> > >
> > > I figured I would be fine with the number of documents up to the 2-4B
> > > range - and the data uploads for the project are finished, so the
> > > indexes shouldn't need to get larger after this, but it looks like
> > > I've hit a limit between 242M-291M documents.
> >
> > I think the problem is on line 77 of FieldReader.java.  Try replacing
> > the line:
> >
> >     size = (int)indexStream.length() / 8;
> >
> > with:
> >
> >     size = (int)(indexStream.length() / 8);
> >
> > I believe this will fix the problem.  Tell me if it does.
> >
> > Thanks,
> >
> > Doug
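To make the precedence issue concrete, here is a small standalone sketch (not Lucene code; the class name and simulated length are illustrative). In Java the cast binds tighter than the division, so `(int)indexStream.length() / 8` truncates the `long` file length to `int` *before* dividing. Once the index file exceeds 2 GB the truncated value goes negative, which is exactly what produced the negative document count in the thread:

```java
// Demonstrates the cast-precedence bug from the thread. With an index
// of 291440746 documents at 8 bytes per entry, the file is ~2.33 GB,
// which exceeds Integer.MAX_VALUE (2147483647).
public class CastPrecedenceDemo {
    public static void main(String[] args) {
        // Simulated return value of indexStream.length()
        long length = 291440746L * 8L; // 2331525968, larger than any int

        // Buggy: (int) applies first, truncating the long to a negative
        // int, which is then divided by 8.
        int buggySize = (int) length / 8;

        // Fixed: divide in 64-bit arithmetic, then cast the small result.
        int fixedSize = (int) (length / 8);

        System.out.println("buggy: " + buggySize); // prints -245430166
        System.out.println("fixed: " + fixedSize); // prints 291440746
    }
}
```

Note that the buggy result, -245430166, matches the negative document count in Vince's error log exactly: 2331525968 wraps to -1963441328 as an `int`, and dividing that by 8 gives -245430166.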
