Don't see why not.  I think it would be a good
addition. Better to adhere to a delay than to get
banned altogether :)
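For anyone who hasn't seen the extension in the wild: it's just an extra directive in robots.txt, e.g.

    User-agent: *
    Crawl-delay: 10

As a quick sketch (not Nutch code, just an illustration), Python's stdlib robots.txt parser happens to understand the directive, which shows how simple the parsing side is:

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body containing the non-standard Crawl-delay directive.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Crawl-delay: 10",
])

# The delay (in seconds, per the Yahoo!/MSN convention) for any agent
# matched by the wildcard entry:
print(rp.crawl_delay("Nutch"))  # 10
```

A fetcher would then sleep at least that long between successive requests to the same host, presumably capped by a configurable maximum so a hostile robots.txt can't stall the crawl.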

--- Doug Cutting <[EMAIL PROTECTED]> wrote:
> Anyone seen these discussions:
> 
> http://www.webmasterworld.com/forum97/93.htm
> http://www.webmasterworld.com/forum93/263.htm
> 
> It looks like Yahoo! and Microsoft's crawlers
> support a non-standard 
> extension to robots.txt that permits the crawl delay
> time to be 
> specified.  Should we add support for this to Nutch?
> 
> Doug
> 
> 
> -------------------------------------------------------
> This SF.Net email sponsored by Black Hat Briefings &
> Training.
> Attend Black Hat Briefings & Training, Las Vegas
> July 24-29 - 
> digital self defense, top technical experts, no
> vendor pitches, 
> unmatched networking opportunities. Visit
> www.blackhat.com
> _______________________________________________
> Nutch-developers mailing list
> [EMAIL PROTECTED]
> https://lists.sourceforge.net/lists/listinfo/nutch-developers
> 


