If you set http.max.delays to 300, then you have to balance it against
fetcher.server.delay, which by default is set to 5 seconds. If for some
reason a thread fails every time it tries to crawl a page, it will have
waited 300 x 5 = 1500 seconds before it finally gives up -- which of
course hurts performance.
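
If you do raise the limit, it usually helps to lower fetcher.server.delay
at the same time. A minimal sketch of what the overrides might look like
in conf/nutch-site.xml (the property names come from nutch-default.xml;
the values and description wording below are just illustrative, not
recommendations):

  <property>
    <name>http.max.delays</name>
    <value>100</value>
    <description>Illustrative value. Number of times a fetcher thread will
    wait on a busy host before giving up on the page with
    "Exceeded http.max.delays: retry later".</description>
  </property>

  <property>
    <name>fetcher.server.delay</name>
    <value>2.0</value>
    <description>Illustrative value. Seconds the fetcher waits between
    successive requests to the same server; the worst-case wait per page is
    roughly http.max.delays * fetcher.server.delay.</description>
  </property>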

Good luck

On 7/27/05, Feng (Michael) Ji <[EMAIL PROTECTED]> wrote:
> Hi there:
> 
> I checked the log file and found that some site links hit the error
> "Exceeded http.max.delays: retry later".
> 
> I changed the corresponding value in the conf file,
> nutch-default.xml, to 300, but that still seems not to be
> enough. Will that affect the performance of crawling?
> 
> Any idea?
> 
> thanks,
> 
> Michael
> 

