[Nutch-dev] [jira] Commented: (NUTCH-446) RobotRulesParser should ignore Crawl-delay values of other bots in robots.txt

2007-05-10 Thread JIRA

[ 
https://issues.apache.org/jira/browse/NUTCH-446?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12494734
 ] 

Doğacan Güney commented on NUTCH-446:
-------------------------------------

So, does anyone have objections to this? It fixes an annoying (albeit rare) bug 
in which Nutch doesn't fetch pages even though it is allowed to, or behaves too 
politely or impolitely. And it doesn't seem to break anything.

 RobotRulesParser should ignore Crawl-delay values of other bots in robots.txt
 -

 Key: NUTCH-446
 URL: https://issues.apache.org/jira/browse/NUTCH-446
 Project: Nutch
  Issue Type: Bug
  Components: fetcher
Affects Versions: 0.9.0
Reporter: Doğacan Güney
Priority: Minor
 Fix For: 1.0.0

 Attachments: crawl-delay.patch, crawl-delay_test.patch


 RobotRulesParser doesn't check addRules when reading the Crawl-delay 
 value, so the Nutch bot will pick up another robot's Crawl-delay value 
 from robots.txt. 
 Let me try to be more clear:

 User-agent: foobot
 Crawl-delay: 3600

 User-agent: *
 Disallow: /baz

 Given such a robots.txt file, the Nutch bot will get 3600 as its Crawl-delay
 value, no matter what the Nutch bot's name actually is.
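The fix amounts to only honoring a Crawl-delay that appears inside a User-agent group matching our own bot (or the `*` group as a fallback). Below is a minimal sketch of that agent-scoped parsing, in Python rather than Nutch's actual Java RobotRulesParser, and using exact-name agent matching for simplicity where real parsers do prefix/substring matching:

```python
# Sketch only (not Nutch's RobotRulesParser): return the Crawl-delay that
# applies to a given agent, ignoring delays declared under other bots.

def crawl_delay_for(agent, robots_txt):
    """Return the Crawl-delay (seconds) for `agent`, or None if unset."""
    agent = agent.lower()
    current_agents = []    # agents named by the current User-agent block
    seen_rule = False      # a non-User-agent line ends the block header
    specific_delay = None  # delay from a group naming `agent` exactly
    wildcard_delay = None  # delay from the "*" group
    for raw in robots_txt.splitlines():
        line = raw.split('#', 1)[0].strip()   # drop comments and whitespace
        if not line or ':' not in line:
            continue
        field, _, value = line.partition(':')
        field, value = field.strip().lower(), value.strip()
        if field == 'user-agent':
            if seen_rule:                     # a new group starts: reset
                current_agents, seen_rule = [], False
            current_agents.append(value.lower())
        else:
            seen_rule = True
            if field == 'crawl-delay':
                if agent in current_agents and specific_delay is None:
                    specific_delay = float(value)
                elif '*' in current_agents and wildcard_delay is None:
                    wildcard_delay = float(value)
    return specific_delay if specific_delay is not None else wildcard_delay

robots = """User-agent: foobot
Crawl-delay: 3600

User-agent: *
Disallow: /baz
"""
```

With the robots.txt from the issue, `crawl_delay_for("foobot", robots)` yields 3600.0, while any other agent name falls into the `*` group, which sets no Crawl-delay at all, which is the behavior the patch restores.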

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.


___
Nutch-developers mailing list
Nutch-developers@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/nutch-developers


[Nutch-dev] [jira] Commented: (NUTCH-446) RobotRulesParser should ignore Crawl-delay values of other bots in robots.txt

2007-05-01 Thread Sami Siren (JIRA)

[ 
https://issues.apache.org/jira/browse/NUTCH-446?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12492850
 ] 

Sami Siren commented on NUTCH-446:
----------------------------------

+1

