Hello Team,

I am running an experimental cluster of 5 Nutch 0.8-dev (2006-03-18) fetchers
and intend to crawl 100 million pages.

I understand that a balance must be struck between having too few and too many
segments on the searchers. I intend to:

- run 5 searchers
- each responsible for 20M pages
- with 10M pages per segment (2 segments per searcher)
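
To make the layout concrete, here is roughly what I picture on each search
node (the paths and segment names are purely illustrative):

   search-node-1/                # one of the 5 searchers
     segments/
       20060318-segA/            # ~10M pages
       20060318-segB/            # ~10M pages
     index/                      # index covering both segments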

However, I am nervous about fetching 10M pages at a time, because of the pain
that would result if a fetch cycle failed partway through. I would prefer to
fetch smaller 1M-page segments, if there is a way to combine the segments
afterwards. Is there a way to do that?
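
For concreteness, the cycle I have in mind looks something like the sketch
below. The generate/fetch/updatedb invocations are the standard 0.8 commands;
the final merge command is purely hypothetical, i.e. it is the tool I am
asking about:

   # fetch ten 1M-page slices rather than one 10M-page segment
   for i in 1 2 3 4 5 6 7 8 9 10; do
     bin/nutch generate crawl/crawldb crawl/segments -topN 1000000
     s=`ls -d crawl/segments/2* | tail -1`    # newest segment dir
     bin/nutch fetch $s
     bin/nutch updatedb crawl/crawldb $s
   done
   # hypothetical: combine the ten small segments into one ~10M-page segment
   bin/nutch mergesegs crawl/segments-merged -dir crawl/segments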

BTW, I do understand that multi-segment indexes can be merged, so I'm asking
only about combining the segment data itself.

Many thanks,

Monu Ogbe



