Well, I really don't have a clue what'll happen with that many documents. From what I understand, it's more a matter of the number of unique terms than the number of documents.
I'll be *really* curious how it turns out.

Erick

On Thu, Mar 6, 2008 at 6:03 PM, Ray <[EMAIL PROTECTED]> wrote:
>
> Thanks for your answer.
>
> Well, I want to search around 6 billion documents.
> Most of them are very small, but I am confident I will be hitting
> that number in the long run.
>
> I am currently running a small random-text indexer at 400 docs/second.
> It will reach 2 billion in around 45 days.
>
> I really hope all of you who are saying 2 billion docs
> will bring Lucene to its knees are wrong...
>
> Ray.
>
> ----- Original Message -----
> From: "Erick Erickson" <[EMAIL PROTECTED]>
> To: <java-user@lucene.apache.org>
> Sent: Thursday, March 06, 2008 10:40 PM
> Subject: Re: MultiSearcher to overcome the Integer.MAX_VALUE limit
>
>
> > Well, I'm not sure. But any index, even one split among many nodes,
> > is going to have some interesting performance characteristics if you
> > have over 2 billion documents... So I'm not sure it matters <G>...
> >
> > What problem are you really trying to solve? You'll probably get
> > more meaningful answers if you tell us what that is.
> >
> > Best
> > Erick
> >
> > On Thu, Mar 6, 2008 at 10:23 AM, Ray <[EMAIL PROTECTED]> wrote:
> >
> >> Hey guys,
> >>
> >> just a quick question to confirm an assumption I have.
> >>
> >> Is it correct that I can have around 100 indexes, each at its
> >> Integer.MAX_VALUE limit of documents, but can happily
> >> search them all with a MultiSearcher as long as the combined
> >> returned hits don't add up to Integer.MAX_VALUE themselves?
> >>
> >> Kind regards,
> >>
> >> Ray.
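For what it's worth, the arithmetic behind Ray's question can be sketched in plain Java. This is only an illustration of the sharding/overflow math, not Lucene API: the class and method names (`ShardMath`, `shardsNeeded`) are made up here, and the 6-billion figure is just Ray's stated target from the thread.

```java
// Back-of-the-envelope math for the thread above: how many
// Integer.MAX_VALUE-capped indexes would a hypothetical 6 billion
// documents need, and why a combined hit count belongs in a long.
public class ShardMath {

    // Ceiling division: shards needed to hold totalDocs when each
    // shard can hold at most perShardLimit documents.
    public static long shardsNeeded(long totalDocs, long perShardLimit) {
        return (totalDocs + perShardLimit - 1) / perShardLimit;
    }

    public static void main(String[] args) {
        long totalDocs = 6000000000L;      // Ray's stated target
        long perShard = Integer.MAX_VALUE; // 2,147,483,647 docs per index

        // 6e9 / 2,147,483,647 rounds up to 3 shards.
        System.out.println("shards needed: " + shardsNeeded(totalDocs, perShard));

        // Summing per-shard hit counts in an int could overflow;
        // a long easily holds even 100 full shards (~2.1e11 docs).
        long combinedDocs = 100L * perShard;
        System.out.println("100 full indexes hold " + combinedDocs + " docs");
    }
}
```

The same ceiling-division trick applies to any per-shard capacity; the key point for Ray's assumption is that any total maintained across shards must be accumulated in a `long`, since 100 indexes at the limit already exceed what an `int` can represent roughly a hundredfold.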