Achilleas Mantzios wrote:
> 
> I know i've been bombing you with questions but
> can you please tell me
> 
a) how many URLs aspseek crawls per day (let's say the average case: an
average 200 kbytes/s connection to the web server, and an average document
size of 400 words, or 4 kbytes) in your own runs? (VERY IMPORTANT)
> b) Can we run aspseek on multiple machines all feeding up one db??
c) can you please send me your aspseek.conf file (from a typical run)? What I
need to see is your initial seed of servers and how you use MaxDocSize,
Follow, FollowOutside
> 
Please send this info if you can, or I will wait for the person who is
currently on vacation.

As for b): no, but you can have s.cgi connect to several search daemons, which
gives you a distributed system, though in a different way. For example, one
machine indexes these sites, another indexes those sites, and a third runs
s.cgi with two "DaemonAddress" variables in its template, pointing to the two
index machines. s.cgi then does the work of merging and sorting the results.
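For illustration, a fragment of the s.cgi search template with two
"DaemonAddress" entries might look like the sketch below. The hostnames and
ports are placeholders, and the exact template syntax may differ between
ASPseek versions, so check the documentation shipped with your release:

```
<!--variables
# Each DaemonAddress line points s.cgi at one search daemon (host:port).
# index1 and index2 are hypothetical machines, each indexing its own
# set of sites; s.cgi merges and sorts the results from both.
DaemonAddress	index1.example.com:12345
DaemonAddress	index2.example.com:12345
-->
```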

As for the other questions: yes, only he can answer them. Please be patient
and wait...

PS: BTW, you can subscribe to the aseek-users mailing list. Just send mail
to [EMAIL PROTECTED] with the line

subscribe aseek-users

in the body.

-- |< [] [] |_    [EMAIL PROTECTED]    http://kir.sever.net   ICQ 7551596 --
There are two ways to write error-free programs; only the third one works.
                                                  (C) 1982, Alan J. Perlis
