Hi there...

I'm looking at a future project that involves indexing close to 2 BILLION
documents... seriously... and they will all be HTML pages and the like.

I haven't run ASPSeek across multiple machines, but I've read that you can
list a series of IPs for s.cgi to search across... Does this work well, and
what kind of performance could be expected (best guess)?  My average search
time is currently about 0.05 seconds with 3.2 million documents.
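
To make the question concrete, here is roughly what I'm picturing in the
s.cgi template (s.htm) to point one front end at several searchd daemons.
The directive name and syntax are just my guess from what I've read, and the
addresses are placeholders, so please correct me if this is off:

    DaemonAddress 10.0.0.1
    DaemonAddress 10.0.0.2
    DaemonAddress 10.0.0.3

...with one searchd instance (and its own slice of the index) on each box,
and s.cgi merging the results from all of them.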

If I put 250 servers on a very, very fast Ethernet setup, along with fast
SCSI disks etc., would it be possible to search across the combination of
these machines and get accurate results with fast response times?
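
Back-of-envelope, that would work out to roughly:

    2,000,000,000 docs / 250 machines = 8,000,000 docs per machine

which is only about 2.5x the 3.2 million I'm already searching in ~0.05
seconds on a single box, so the per-machine load doesn't look unreasonable
on paper.  What I can't estimate is the cost of merging results across 250
daemons.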

It's a similar concept to Google's setup (last I heard, around 600 machines
running for about 2 billion documents)...

Any ideas?  Is anyone out there using multiple machines?  If you are, I'd
love to know your hardware details, a URL if possible, and the response
times you're seeing... along with the number of documents you have on each
server, total URLs, etc.  Details are important...

Thanks for your time... appreciate it...

Paul Stewart
