[
https://issues.apache.org/jira/browse/NUTCH-2801?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17155507#comment-17155507
]
ASF GitHub Bot commented on NUTCH-2801:
---------------------------------------
sebastian-nagel commented on a change in pull request #537:
URL: https://github.com/apache/nutch/pull/537#discussion_r452865637
##########
File path: src/java/org/apache/nutch/protocol/RobotRulesParser.java
##########
@@ -376,13 +379,18 @@ public int run(String[] args) {
*/
private static class TestRobotRulesParser extends RobotRulesParser {
- public TestRobotRulesParser(Configuration conf) {
+ public void setConf(Configuration conf) {
// make sure that agent name is set so that setConf() does not complain,
Review comment:
Thanks. You're right, the comment wasn't up-to-date. It would have been
simpler to drop the command-line overwrite of the checked agent names and rely
only on the properties, but I didn't want to break the existing behavior.
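The fall-back discussed here can be sketched as a small helper that builds the ordered list of agent names to check: command-line names win; otherwise http.agent.name comes first, followed by the names from http.robots.agents. This is an illustrative sketch, not the actual Nutch implementation; the class and method names are hypothetical, and the final "User-agent: *" fall-back is assumed to be handled by the robots.txt parser itself, so it is not added to the list.

```java
import java.util.ArrayList;
import java.util.List;

public class AgentNameFallback {

  /**
   * Hypothetical sketch of the fall-back described in NUTCH-2801.
   * Returns the ordered agent names to test against a robots.txt file:
   * explicit command-line names override everything; otherwise
   * http.agent.name is checked first, then the names listed in
   * http.robots.agents. The "User-agent: *" rules are assumed to be the
   * parser's own last resort and are not listed here.
   */
  static List<String> agentNames(String cmdLineAgents, String agentName,
      String robotsAgents) {
    List<String> names = new ArrayList<>();
    if (cmdLineAgents != null && !cmdLineAgents.isEmpty()) {
      // explicit command-line argument overrides the properties
      for (String n : cmdLineAgents.split(",")) {
        names.add(n.trim().toLowerCase());
      }
      return names;
    }
    if (agentName != null && !agentName.isEmpty()) {
      // http.agent.name is checked first
      names.add(agentName.trim().toLowerCase());
    }
    if (robotsAgents != null) {
      // then the additional names from http.robots.agents, deduplicated
      for (String n : robotsAgents.split(",")) {
        String t = n.trim().toLowerCase();
        if (!t.isEmpty() && !names.contains(t)) {
          names.add(t);
        }
      }
    }
    return names;
  }

  public static void main(String[] args) {
    // no command-line agents: fall back to the properties, as in the
    // example from the issue description
    System.out.println(agentNames(null, "mybot", "nutch,goodbot"));
  }
}
```

With this ordering, the log line "Testing robots.txt for agent names: mybot,nutch,goodbot" would match what is actually checked, instead of only "mybot".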
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
> RobotsRulesParser command-line checker to use http.robots.agents as fall-back
> -----------------------------------------------------------------------------
>
> Key: NUTCH-2801
> URL: https://issues.apache.org/jira/browse/NUTCH-2801
> Project: Nutch
> Issue Type: Bug
> Components: checker, robots
> Affects Versions: 1.17
> Reporter: Sebastian Nagel
> Assignee: Sebastian Nagel
> Priority: Minor
> Fix For: 1.18
>
>
> The RobotsRulesParser command-line tool, used to check a list of URLs against
> one robots.txt file, should use the value of the property
> {{http.robots.agents}} as a fall-back if no user agent names are explicitly
> given as a command-line argument. In this case it should behave the same as
> the robots.txt parser: looking first for {{http.agent.name}}, then for other
> names listed in {{http.robots.agents}}, and finally picking the rules for
> {{User-agent: *}}.
> {noformat}
> $> cat robots.txt
> User-agent: Nutch
> Allow: /
> User-agent: *
> Disallow: /
> $> bin/nutch org.apache.nutch.protocol.RobotRulesParser \
> -Dhttp.agent.name=mybot \
> -Dhttp.robots.agents='nutch,goodbot' \
> robots.txt urls.txt
> Testing robots.txt for agent names: mybot,nutch,goodbot
> not allowed: https://www.example.com/
> {noformat}
> The log message "Testing ... for ...: mybot,nutch,goodbot" is misleading.
> Only the name "mybot" is actually checked.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)