Thanks
Paladin wrote:
I use several collections: one of 1,200,000 documents, one of 3,800,000, and the biggest of 12,000,000 documents. Performance is quite good, except for searches with wildcards. Our machine has 1 gigabyte of memory and 2 CPUs.
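For context on the wildcard slowness: in the Lucene 1.x API, a WildcardQuery enumerates every index term matching the pattern and rewrites to a BooleanQuery with one clause per term, which gets expensive on multi-million-document indexes and can hit BooleanQuery's default 1024-clause limit. A minimal sketch (the index path and field name are illustrative, not from this thread):

import org.apache.lucene.index.Term;
import org.apache.lucene.search.Hits;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.WildcardQuery;

public class WildcardExample {
    public static void main(String[] args) throws Exception {
        // Path is a placeholder for a real index directory.
        IndexSearcher searcher = new IndexSearcher("/path/to/index");
        // Expands to one BooleanQuery clause per matching term in the
        // index, so broad patterns on large term dictionaries are slow.
        Hits hits = searcher.search(
                new WildcardQuery(new Term("contents", "inform*")));
        System.out.println(hits.length() + " matching documents");
        searcher.close();
    }
}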
----- Original Message ----- From: "Mark Devaney" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Wednesday, March 10, 2004 4:26 PM
Subject: Large document collections?
I'm looking for information on the largest document collection that Lucene has been used to index; the biggest benchmark I've been able to find so far is 1MM documents/records.
I'd like to generate some benchmarks for large collections (1-100MM) and would like to know if this is feasible without using distributed indexes, etc. It's mostly to construct a performance profile relating indexing/retrieval time and storage requirements to the number of documents.
Thanks.
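A rough harness for the indexing half of such a profile, using the Lucene 1.4-era API, might look like the sketch below. The index path, field names, and synthetic document body are illustrative assumptions; a real benchmark would feed in actual documents, measure index size on disk, and separately time a fixed query set against the resulting index.

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexWriter;

public class IndexBenchmark {
    public static void main(String[] args) throws Exception {
        int numDocs = Integer.parseInt(args[0]); // e.g. 1000000
        // true = create a fresh index at this (placeholder) path.
        IndexWriter writer = new IndexWriter("/tmp/bench-index",
                                             new StandardAnalyzer(), true);
        long start = System.currentTimeMillis();
        for (int i = 0; i < numDocs; i++) {
            Document doc = new Document();
            // Keyword fields are stored and indexed untokenized;
            // Text fields are tokenized for full-text search.
            doc.add(Field.Keyword("id", Integer.toString(i)));
            doc.add(Field.Text("contents", "synthetic document body " + i));
            writer.addDocument(doc);
        }
        writer.optimize(); // merge segments for faster searching
        writer.close();
        long elapsed = System.currentTimeMillis() - start;
        System.out.println(numDocs + " docs indexed in " + elapsed + " ms");
    }
}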
-- Albert Vila Puig http://www.imente.com [iMente, the largest aggregator of Spanish-language headlines] We invite you to visit our new website and try our services
