[
https://issues.apache.org/jira/browse/NUTCH-98?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Markus Jelsma closed NUTCH-98.
------------------------------
Resolution: Won't Fix
Bulk close of legacy issues:
http://www.lucidimagination.com/search/document/2738eeb014805854/clean_up_open_legacy_issues_in_jira
> RobotRulesParser interprets robots.txt incorrectly
> --------------------------------------------------
>
> Key: NUTCH-98
> URL: https://issues.apache.org/jira/browse/NUTCH-98
> Project: Nutch
> Issue Type: Bug
> Components: fetcher
> Affects Versions: 0.7
> Reporter: Jeff Bowden
> Priority: Minor
> Attachments: RobotRulesParser.java.diff
>
>
> Here's a simple example that the current RobotRulesParser gets wrong:
>
>     User-agent: *
>     Disallow: /
>     Allow: /rss
>
> The problem is that the isAllowed function takes the first rule that matches
> and incorrectly decides that URLs starting with "/rss" are Disallowed. The
> correct algorithm is to take the *longest* rule that matches. I will attach
> a patch that fixes this.
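For context, the longest-match behavior the reporter describes can be sketched as below. This is an illustrative standalone class, not the attached RobotRulesParser.java.diff; the class and method names are invented for the example, and ties between equal-length prefixes are resolved by rule order here.

```java
// Sketch of longest-match rule selection for robots.txt, as described
// in the report: among all rules whose path prefix matches the URL,
// the longest prefix wins. Not the actual Nutch patch.
public class LongestMatchRobots {

    // prefixes[i] is a rule's path prefix; allows[i] says whether that
    // rule is an Allow (true) or a Disallow (false).
    public static boolean isAllowed(String[] prefixes, boolean[] allows, String path) {
        int bestLen = -1;
        boolean allowed = true; // no matching rule: allowed by default
        for (int i = 0; i < prefixes.length; i++) {
            if (path.startsWith(prefixes[i]) && prefixes[i].length() > bestLen) {
                bestLen = prefixes[i].length();
                allowed = allows[i];
            }
        }
        return allowed;
    }

    public static void main(String[] args) {
        String[] prefixes = {"/", "/rss"};   // Disallow: /   Allow: /rss
        boolean[] allows  = {false, true};
        // First-match semantics would stop at "Disallow: /" and wrongly
        // block /rss/feed; longest-match picks "/rss" and allows it.
        System.out.println(isAllowed(prefixes, allows, "/rss/feed")); // prints true
        System.out.println(isAllowed(prefixes, allows, "/private"));  // prints false
    }
}
```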
--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira