A while back I posted a message about indexing 2+ billion documents and got
a detailed, helpful response.  Thank you.  That project is on hold due to
the 600+ servers it would take to do the job properly. ;)

Another project I am looking at expanding would involve roughly 40 million
documents located at a series of various URLs.

Would the following scenario work ok?

(8) servers running ASPSeek and indexing various URLs.  Each server would
index its own unique set of URLs.

(2) search front ends running ASPSeek, each configured with the IP list of
the 8 searchd instances.
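In effect the two front ends would do scatter-gather: each query fans out to
all eight searchd instances and the partial hit lists get merged by score.
ASPSeek handles that internally, but the idea can be sketched as below.  The
shard addresses and the query_shard() stub are hypothetical stand-ins, not
the real searchd protocol:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical shard list -- one entry per indexing server running searchd.
SHARDS = ["10.0.0.%d:4242" % i for i in range(1, 9)]

def query_shard(addr, terms):
    """Stand-in for one searchd query; returns (url, score) pairs.
    A real front end would speak ASPSeek's own wire protocol here."""
    # Simulated partial result set, for illustration only.
    return [("http://example.com/%s/%s" % (addr, t), hash((addr, t)) % 100)
            for t in terms]

def search(terms, shards=SHARDS, limit=10):
    # Fan the query out to every shard in parallel...
    with ThreadPoolExecutor(max_workers=len(shards)) as pool:
        partials = pool.map(lambda a: query_shard(a, terms), shards)
    # ...then merge the partial hit lists by descending score.
    merged = [hit for part in partials for hit in part]
    merged.sort(key=lambda hit: hit[1], reverse=True)
    return merged[:limit]
```

The merge step is cheap as long as each shard only returns its top N hits,
which is why the front ends can stay thin relative to the indexers.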

I realize this is pretty simple, but has anyone actually tested running
s.cgi against multiple searchd instances?  How fast is it, and what kind of
document counts and query volumes are you running?

Does this seem reasonable to try?

Thanks,

Paul Stewart
