> I believe that my database will end up around 2-3 million pages
> before done. What hardware are others running to make this work in a
> timely fashion?

i tested things on an Alpha running NetBSD/Alpha-current with 1 GB RAM
and a single 500 MHz processor, running mnogosearch 3.1.12-cvs in
cachemode on postgresql. i've been testing since 3.1.8. with pgsql
index tuning and such, i've gotten average search time down to less
than a second with over 1 million URLs indexed.
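for reference, the "index tuning" i mean is mostly just adding btree indexes on the columns the search queries filter on, then refreshing planner statistics. the table and column names below are a sketch based on my setup and may not match your schema version, so check with \d in psql before running them:

```sql
-- hypothetical example: table/column names are assumptions,
-- verify against your own mnogosearch schema first.
-- index the url table on columns indexer and search hit often:
CREATE INDEX url_url ON url (url);
CREATE INDEX url_status ON url (status);

-- after a big indexing run, let the planner see the new row counts:
VACUUM ANALYZE url;
```

the VACUUM ANALYZE after bulk loads matters as much as the indexes themselves, since the planner will otherwise keep using stale statistics.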

however, there are repeatable (and reported) crashes in indexer,
cachelogd, and splitter that would prevent mnogosearch from being used
in a production environment with a large number of indexed sites. if
you have a few thousand pages, it works wonderfully. when you get into
the hundreds of thousands or millions of URLs to index, you will see
core dumps galore.

remember, this is under cachemode, which is the fastest mode. i've yet
to test crc-multi under sql with that large a number of URLs.
