support for Crawl-delay in Robots.txt
-------------------------------------
Key: NUTCH-293
URL: http://issues.apache.org/jira/browse/NUTCH-293
Project: Nutch
Type: Improvement
Components: fetcher
Versions: 0.8-dev
Reporter: Stefan Groschupf
Priority: Critical
Nutch needs support for the Crawl-delay directive in robots.txt. It is not part of the
official robots exclusion standard, but it is a de-facto standard honored by major crawlers.
See:
http://help.yahoo.com/help/us/ysearch/slurp/slurp-03.html
Webmasters have started blocking Nutch because we do not support it.
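
For illustration, here is a minimal sketch of how a fetcher could read a Crawl-delay
value (specified in seconds) from the robots.txt group that matches its agent name and
convert it to a per-host wait time in milliseconds. This is not Nutch's actual
RobotRulesParser API; the class and method names below are hypothetical.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

/** Hypothetical sketch: extract the Crawl-delay for a given agent from robots.txt. */
public class CrawlDelaySketch {

    /** Returns the Crawl-delay in milliseconds for the matching group, or -1 if none is set. */
    static long getCrawlDelay(String robotsTxt, String agentName) throws IOException {
        BufferedReader reader = new BufferedReader(new StringReader(robotsTxt));
        String line;
        boolean inMatchingGroup = false;   // true while reading rules for our agent (or "*")
        long delayMillis = -1;
        while ((line = reader.readLine()) != null) {
            // Strip comments and surrounding whitespace.
            int hash = line.indexOf('#');
            if (hash >= 0) line = line.substring(0, hash);
            line = line.trim();
            if (line.isEmpty()) continue;

            int colon = line.indexOf(':');
            if (colon < 0) continue;
            String field = line.substring(0, colon).trim().toLowerCase();
            String value = line.substring(colon + 1).trim();

            if (field.equals("user-agent")) {
                String ua = value.toLowerCase();
                inMatchingGroup = ua.equals("*") || agentName.toLowerCase().contains(ua);
            } else if (inMatchingGroup && field.equals("crawl-delay")) {
                try {
                    // Crawl-delay is given in seconds; some sites use fractional values.
                    delayMillis = (long) (Double.parseDouble(value) * 1000);
                } catch (NumberFormatException e) {
                    // Ignore malformed values and keep any previously seen delay.
                }
            }
        }
        return delayMillis;
    }

    public static void main(String[] args) throws IOException {
        String robots = "User-agent: *\nCrawl-delay: 5\nDisallow: /private/\n";
        System.out.println(getCrawlDelay(robots, "Nutch"));  // prints 5000
    }
}

The fetcher could then use the returned value, when non-negative, in place of its
configured per-host delay (e.g. fetcher.server.delay); how this would be wired into the
fetcher queues is left open here.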