[ https://issues.apache.org/jira/browse/NUTCH-2801?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17155472#comment-17155472 ]

ASF GitHub Bot commented on NUTCH-2801:
---------------------------------------

sebastian-nagel opened a new pull request #537:
URL: https://github.com/apache/nutch/pull/537


   - if no agent names are given as command-line arguments, use the values 
of http.agent.name and http.robots.agents as the agent names to be checked
   - update the command-line help
   
   ```
   $> nutch org.apache.nutch.protocol.RobotRulesParser \
         -Dhttp.agent.name='mybot' \
         -Dhttp.robots.agents='nutch,goodbot' \
         robots.txt urls.txt 
   Testing robots.txt for agent names: mybot,nutch,goodbot
   allowed:        https://www.example.com/
   
   # command-line overwrite:
   $> nutch org.apache.nutch.protocol.RobotRulesParser \
         -Dhttp.agent.name='mybot' \
         -Dhttp.robots.agents='nutch,goodbot' \
         robots.txt urls.txt \
         badbot,anybot
   Testing robots.txt for agent names: badbot,anybot
   not allowed:    https://www.example.com/
   ```
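
The fallback described above can be sketched in Java. This is a hypothetical illustration of the agent-name selection logic, not the actual `RobotRulesParser` code; the class and method names are invented:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.List;

public class AgentNameFallback {

    /**
     * Build the list of agent names to check against robots.txt.
     * If names were passed on the command line, they win; otherwise
     * fall back to http.agent.name followed by http.robots.agents
     * (duplicates removed, order preserved).
     */
    static List<String> agentNames(String cliArg, String agentName,
                                   String robotsAgents) {
        if (cliArg != null && !cliArg.isEmpty()) {
            // command-line overwrite
            return Arrays.asList(cliArg.split(","));
        }
        LinkedHashSet<String> names = new LinkedHashSet<>();
        names.add(agentName);
        for (String n : robotsAgents.split(",")) {
            String trimmed = n.trim();
            if (!trimmed.isEmpty()) {
                names.add(trimmed);
            }
        }
        return new ArrayList<>(names);
    }

    public static void main(String[] args) {
        // no command-line names: fall back to the two properties
        System.out.println(agentNames(null, "mybot", "nutch,goodbot"));
        // command-line names given: properties are ignored
        System.out.println(agentNames("badbot,anybot", "mybot", "nutch,goodbot"));
    }
}
```

With this ordering, the log line "Testing robots.txt for agent names: mybot,nutch,goodbot" matches what is actually checked.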


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


> RobotsRulesParser command-line checker to use http.robots.agents as fall-back
> -----------------------------------------------------------------------------
>
>                 Key: NUTCH-2801
>                 URL: https://issues.apache.org/jira/browse/NUTCH-2801
>             Project: Nutch
>          Issue Type: Bug
>          Components: checker, robots
>    Affects Versions: 1.17
>            Reporter: Sebastian Nagel
>            Assignee: Sebastian Nagel
>            Priority: Minor
>             Fix For: 1.18
>
>
> The RobotsRulesParser command-line tool, used to check a list of URLs against 
> one robots.txt file, should use the value of the property 
> {{http.robots.agents}} as fall-back if no user agent names are explicitly 
> given as command-line arguments. In this case it should behave the same as 
> the robots.txt parser, looking first for {{http.agent.name}}, then for the 
> other names listed in {{http.robots.agents}}, and finally picking the rules 
> for {{User-agent: *}}.
> {noformat}
> $> cat robots.txt
> User-agent: Nutch
> Allow: /
> User-agent: *
> Disallow: /
> $> bin/nutch org.apache.nutch.protocol.RobotRulesParser \
>       -Dhttp.agent.name=mybot \
>       -Dhttp.robots.agents='nutch,goodbot' \
>       robots.txt urls.txt 
> Testing robots.txt for agent names: mybot,nutch,goodbot
> not allowed:    https://www.example.com/
> {noformat}
> The log message "Testing ... for ...: mybot,nutch,goodbot" is misleading. 
> Only the name "mybot" is actually checked.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
