RobotRulesParser should ignore Crawl-delay values of other bots in robots.txt
-----------------------------------------------------------------------------
Key: NUTCH-446
URL: https://issues.apache.org/jira/browse/NUTCH-446
Project: Nutch
Issue Type: Bug
Components: fetcher
Affects Versions: 0.9.0
Reporter: Doğacan Güney
Priority: Minor
Fix For: 0.9.0
Attachments: crawl-delay.patch
RobotRulesParser doesn't check addRules when reading the Crawl-delay value,
so the Nutch bot will pick up a Crawl-delay that was specified for another
robot in robots.txt.
Let me try to make this clearer. Given a robots.txt like:
User-agent: foobot
Crawl-delay: 3600
User-agent: *
Disallow: /baz
With such a robots.txt file, the Nutch bot will end up with 3600 as its
crawl-delay value, no matter what the bot's agent name actually is.
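For illustration only, here is a minimal sketch (not the actual RobotRulesParser source; the class and method names are made up) of how the Crawl-delay handling can be gated on the same addRules flag that already guards the Disallow rules, so a delay declared under another bot's record is ignored:

    import java.util.Locale;

    /**
     * Sketch: only honor Crawl-delay when the current User-agent record
     * applies to our agent (the "addRules" check the bug is about).
     */
    public class CrawlDelaySketch {

      /** Returns the crawl delay in ms for agentName, or -1 if none applies. */
      static long crawlDelayFor(String robotsTxt, String agentName) {
        boolean addRules = false;   // does the current record apply to us?
        long crawlDelay = -1;

        for (String line : robotsTxt.split("\n")) {
          String l = line.trim().toLowerCase(Locale.ROOT);
          if (l.startsWith("user-agent:")) {
            String agent = l.substring("user-agent:".length()).trim();
            addRules = agent.equals("*")
                || agent.contains(agentName.toLowerCase(Locale.ROOT));
          } else if (l.startsWith("crawl-delay:")) {
            // The fix: skip the delay unless this record is for our agent.
            if (addRules) {
              try {
                crawlDelay = Long.parseLong(
                    l.substring("crawl-delay:".length()).trim()) * 1000L;
              } catch (NumberFormatException ignored) {
                // malformed value, keep the previous delay
              }
            }
          }
          // Disallow/Allow handling omitted for brevity.
        }
        return crawlDelay;
      }

      public static void main(String[] args) {
        String robots =
            "User-agent: foobot\n" +
            "Crawl-delay: 3600\n" +
            "\n" +
            "User-agent: *\n" +
            "Disallow: /baz\n";
        System.out.println(crawlDelayFor(robots, "nutch"));   // -1, foobot's delay ignored
        System.out.println(crawlDelayFor(robots, "foobot"));  // 3600000
      }
    }

Real robots.txt record handling is more involved (multiple User-agent lines per record, precedence of a specific agent match over "*"), but the point is the same: without the addRules check, the Crawl-delay: 3600 line above is applied to every agent.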