If you set http.max.delays to 300, then you have to balance it against
fetcher.server.delay, which defaults to 5 seconds. If for some reason a
thread fails every time it tries to fetch, it will have waited
300 x 5 = 1500 seconds before it finally gives up -- which of course
hurts performance.
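In case it helps, the two properties are set in nutch-default.xml (and can be overridden in nutch-site.xml). A sketch of what the entries look like -- the values below are just the ones discussed in this thread, not recommendations:

```xml
<!-- How many times a fetcher thread will delay, waiting for a busy
     host, before giving up on the page with
     "Exceeded http.max.delays: retry later". -->
<property>
  <name>http.max.delays</name>
  <value>300</value>
</property>

<!-- Seconds the fetcher waits between successive requests to the same
     server. Worst case, a failing thread waits roughly
     http.max.delays * fetcher.server.delay seconds. -->
<property>
  <name>fetcher.server.delay</name>
  <value>5.0</value>
</property>
```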

Good luck

On 7/27/05, Feng (Michael) Ji <[EMAIL PROTECTED]> wrote:
> Hi there:
> 
> I checked the log file and found that some site links hit the error
> "Exceeded http.max.delays: retry later".
> 
> I changed the corresponding value in the conf file,
> nutch-default.xml, to 300, but it seems that's still not
> enough. Will that affect crawling performance?
> 
> Any idea?
> 
> thanks,
> 
> Michael
> 
>
