[ https://issues.apache.org/jira/browse/NUTCH-1995?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14553219#comment-14553219 ]

Sebastian Nagel commented on NUTCH-1995:
----------------------------------------

Hi Chris, it's not about Giuseppe's use case. It's fine to extend the whitelist 
functionality from a list of host names to a list of domains, e.g. 
"*.sample.com". But the patch would make it easy to ignore any robots.txt 
anywhere:
{noformat}
<property>
  <name>http.robot.rules.whitelist</name>
  <value>*</value>
</property>
{noformat}
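For illustration, here is a minimal sketch of how such wildcard matching against the whitelist could work. The class and method names are hypothetical and not taken from the attached patch; they only show why a bare {{*}} entry matches every host:
{noformat}
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch, not the actual Nutch implementation.
public class RobotsWhitelist {

  // whitelist: comma-separated hostnames, "*.domain" patterns, or "*"
  public static boolean isWhitelisted(String host, String whitelist) {
    List<String> entries = Arrays.asList(whitelist.split("\\s*,\\s*"));
    for (String entry : entries) {
      if (entry.equals("*")) {
        return true; // matches every host: robots.txt ignored everywhere
      }
      if (entry.startsWith("*.")) {
        // "*.sample.com" -> match any host ending in ".sample.com"
        String suffix = entry.substring(1);
        if (host.endsWith(suffix)) {
          return true;
        }
      } else if (entry.equalsIgnoreCase(host)) {
        return true; // exact hostname/IP match, as in NUTCH-1927
      }
    }
    return false;
  }
}
{noformat}
With a whitelist of "*.sample.com", only hosts under that domain match; with "*", every host matches, which is exactly the blanket opt-out I'd rather not support via a property.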

I found this old post 
[1|http://johannburkard.de/blog/www/spam/this-much-nutch-is-too-much-nutch.html#comment9] 
by Tim:
{quote}
Nutch has no configuration settings to turn off it's 'niceness' like obeying 
robots.txt or not hitting a server multiple times per second, which means that 
anyone using it to be rude is modifying the source code to do so.
{quote}
I agree with Tim that there is a subtle difference between supporting 
impoliteness via a property and forcing "rude" users to fork (which we cannot 
prevent anyway). Ok, Memex is a somewhat different use case (I wouldn't call it 
"rude"). But the normal use case should be that crawler and site owner behave 
cooperatively, and that implies politeness.

> Add support for wildcard to http.robot.rules.whitelist
> ------------------------------------------------------
>
>                 Key: NUTCH-1995
>                 URL: https://issues.apache.org/jira/browse/NUTCH-1995
>             Project: Nutch
>          Issue Type: Improvement
>          Components: robots
>    Affects Versions: 1.10
>            Reporter: Giuseppe Totaro
>            Assignee: Chris A. Mattmann
>              Labels: memex
>             Fix For: 1.11
>
>         Attachments: NUTCH-1995.patch
>
>
> The {{http.robot.rules.whitelist}} 
> ([NUTCH-1927|https://issues.apache.org/jira/browse/NUTCH-1927]) configuration 
> parameter allows specifying a comma-separated list of hostnames or IP 
> addresses for which robots.txt parsing is ignored.
> Adding support for wildcards in {{http.robot.rules.whitelist}} could be very 
> useful and would simplify the configuration, for example when many 
> hostnames/addresses need to be listed. Here is an example:
> {noformat}
> <property>
>   <name>http.robot.rules.whitelist</name>
>   <value>*.sample.com</value>
>   <description>Comma separated list of hostnames or IP addresses to ignore 
>   robot rules parsing for. Use with care and only if you are explicitly
>   allowed by the site owner to ignore the site's robots.txt!
>   </description>
> </property>
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
