[
https://issues.apache.org/jira/browse/NUTCH-1927?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14510167#comment-14510167
]
Hudson commented on NUTCH-1927:
-------------------------------
FAILURE: Integrated in Nutch-trunk #3084 (See
[https://builds.apache.org/job/Nutch-trunk/3084/])
Add back in NUTCH-1927 property to nutch-default as removed during commit
@1675022 (lewismc:
http://svn.apache.org/viewvc/nutch/trunk/?view=rev&rev=1675735)
* /nutch/trunk/conf/nutch-default.xml
> Create a whitelist of IPs/hostnames to allow skipping of RobotRules parsing
> ---------------------------------------------------------------------------
>
> Key: NUTCH-1927
> URL: https://issues.apache.org/jira/browse/NUTCH-1927
> Project: Nutch
> Issue Type: New Feature
> Components: fetcher
> Reporter: Chris A. Mattmann
> Assignee: Chris A. Mattmann
> Labels: available, patch
> Fix For: 1.10
>
> Attachments: NUTCH-1927.2015-04-16.patch,
> NUTCH-1927.2015-04-17.patch, NUTCH-1927.Mattmann.041115.patch.txt,
> NUTCH-1927.Mattmann.041215.patch.txt, NUTCH-1927.Mattmann.041415.patch.txt,
> test_NUTCH-1927.2015-04-17.txt
>
>
> Based on discussion on the dev list, in order to support some valid
> security-research use cases for Nutch (DDoS, DNS, and other testing), I am
> going to create a patch that allows a whitelist:
> {code:xml}
> <property>
>   <name>robot.rules.whitelist</name>
>   <value>132.54.99.22,hostname.apache.org,foo.jpl.nasa.gov</value>
>   <description>Comma-separated list of hostnames or IP addresses for which
>   robot rules parsing is skipped.
>   </description>
> </property>
> {code}
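>
> A minimal sketch (not the actual NUTCH-1927 patch) of how a fetcher could
> consult such a comma-separated whitelist value before applying robots.txt
> rules; the class and method names below are illustrative assumptions only:
> {code:java}
> import java.util.HashSet;
> import java.util.Set;
>
> /**
>  * Hypothetical helper showing one way a comma-separated
>  * robot.rules.whitelist value could be checked before robots.txt
>  * rules are fetched and parsed. Names here are illustrative only.
>  */
> public class RobotRulesWhitelist {
>
>   private final Set<String> hosts = new HashSet<String>();
>
>   public RobotRulesWhitelist(String configValue) {
>     // Parse the comma-separated property value into a normalized set.
>     if (configValue != null && !configValue.trim().isEmpty()) {
>       for (String entry : configValue.split(",")) {
>         hosts.add(entry.trim().toLowerCase());
>       }
>     }
>   }
>
>   /** Returns true if robots.txt parsing should be skipped for this host or IP. */
>   public boolean isWhitelisted(String hostOrIp) {
>     return hostOrIp != null && hosts.contains(hostOrIp.trim().toLowerCase());
>   }
>
>   public static void main(String[] args) {
>     RobotRulesWhitelist wl = new RobotRulesWhitelist(
>         "132.54.99.22,hostname.apache.org,foo.jpl.nasa.gov");
>     System.out.println(wl.isWhitelisted("foo.jpl.nasa.gov")); // true
>     System.out.println(wl.isWhitelisted("example.com"));      // false
>   }
> }
> {code}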