[
https://issues.apache.org/jira/browse/NUTCH-1927?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14492419#comment-14492419
]
Sebastian Nagel commented on NUTCH-1927:
----------------------------------------
* http.robot.rules.whitelist should be empty by default :)
* the description says "hostnames or IP addresses" - is IP address whitelisting
actually supported?
* instead of repeatedly splitting the whitelisted hosts at ',', use
conf.getStrings(...) to fill the white list once up front
* also, the white list is conceptually a set and should be stored as one, so that
isWhiteListed() is a lookup instead of an iteration over the whole list
* Why is it necessary to create a new WhiteListRobotRules object in Fetcher for
every URL? Wouldn't it be simpler (and more efficient) to use the existing cache
in RobotRulesParser and just put in a reference to a singleton white list rules
object if the host is an element of the white list? (see the sketch below)
> Create a whitelist of IPs/hostnames to allow skipping of RobotRules parsing
> ---------------------------------------------------------------------------
>
> Key: NUTCH-1927
> URL: https://issues.apache.org/jira/browse/NUTCH-1927
> Project: Nutch
> Issue Type: New Feature
> Components: fetcher
> Reporter: Chris A. Mattmann
> Assignee: Chris A. Mattmann
> Labels: available, patch
> Fix For: 1.10
>
> Attachments: NUTCH-1927.Mattmann.041115.patch.txt,
> NUTCH-1927.Mattmann.041215.patch.txt
>
>
> Based on discussion on the dev list, to support some valid security-research
> use cases for Nutch (DDoS, DNS, and other testing), I am going to create a
> patch that allows a whitelist:
> {code:xml}
> <property>
>   <name>robot.rules.whitelist</name>
>   <value>132.54.99.22,hostname.apache.org,foo.jpl.nasa.gov</value>
>   <description>Comma separated list of hostnames or IP addresses to ignore
>   robot rules parsing for.</description>
> </property>
> {code}