Hi,
I crawl ~80k URLs in my seed list, and I notice that from depth 2 onward each 
segment is capped at about 200k URLs. It seems very unlikely that every segment 
at every depth would naturally come out at the same size, so something must be 
limiting it.
I changed sizeFetchlist in the crawl script to: sizeFetchlist=`expr $numSlaves \* 10000000`
But next to that line I saw the comment: "250K per task?"
What does it mean? Is there a limit somewhere that I can't find, or another 
parameter that limits the crawl?
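
For context, here is roughly what I believe the relevant part of the stock 
bin/crawl script looks like (my paraphrase from memory, with the default 
numbers; exact wording and option names may differ between Nutch versions):

    # approximate stock bin/crawl (Nutch 1.x)
    numSlaves=1
    # upper bound on the number of URLs generated into one segment
    sizeFetchlist=`expr $numSlaves \* 50000`   # 250K per task?

    # sizeFetchlist appears to be passed to the generate job as -topN,
    # which caps how many URLs end up in each segment:
    $bin/nutch generate $CRAWL_PATH/crawldb $CRAWL_PATH/segments \
      -topN $sizeFetchlist -numFetchers $numSlaves

Since raising sizeFetchlist (and therefore -topN) did not change anything, I am 
guessing something else is also capping the generator, maybe generate.max.count 
and generate.count.mode in nutch-site.xml? I have not been able to confirm this.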

Thanks,
Shani

