[ 
https://issues.apache.org/jira/browse/DROIDS-105?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12935493#action_12935493
 ] 

Paul Rogalinski commented on DROIDS-105:
----------------------------------------

@Javier

Then use the second Java class attached; it does exactly that. It still uses
Commons-Collections, though; that must have slipped through my change detection
somehow, sorry about that. It is not easy running three copies of Droids in
different versions to get clean patches out of the build system :)

Oh, and the second solution needs to be adapted to the Droids package naming and
reformatted to match the code guidelines. I am sure you have that on a
keyboard shortcut :)

> missing caching for robots.txt
> ------------------------------
>
>                 Key: DROIDS-105
>                 URL: https://issues.apache.org/jira/browse/DROIDS-105
>             Project: Droids
>          Issue Type: Improvement
>          Components: core
>            Reporter: Paul Rogalinski
>         Attachments: Caching-Support-and-Robots_txt-fix.patch, 
> CachingContentLoader.java
>
>
> The current implementation of the HttpClient will not cache any requests to 
> the robots.txt file. When using the CrawlingWorker, this results in two 
> requests to robots.txt (HEAD + GET) per crawled URL, so crawling 3 URLs 
> sends the target server 6 requests for robots.txt.
> Unfortunately, the contentLoader is made final in HttpProtocol, so there is 
> no way to replace it with a caching protocol like the one you'll find in 
> the attachment.
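
The attached CachingContentLoader presumably works as a caching decorator. As a
rough sketch of the idea (not the actual attachment; the `ContentLoader`
interface below is a hypothetical stand-in for the real Droids API), repeated
fetches of the same URI can be served from an in-memory map so robots.txt is
requested at most once per host:

```java
import java.net.URI;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical minimal loader interface; the real Droids ContentLoader differs.
interface ContentLoader {
    byte[] load(URI uri) throws Exception;
}

// Decorator: wraps any ContentLoader and memoizes responses per URI,
// so repeated robots.txt fetches never reach the network a second time.
class CachingContentLoader implements ContentLoader {
    private final ContentLoader delegate;
    private final Map<URI, byte[]> cache = new ConcurrentHashMap<>();

    CachingContentLoader(ContentLoader delegate) {
        this.delegate = delegate;
    }

    @Override
    public byte[] load(URI uri) throws Exception {
        byte[] cached = cache.get(uri);
        if (cached != null) {
            return cached;                     // cache hit: no request sent
        }
        byte[] body = delegate.load(uri);      // cache miss: fetch once
        cache.put(uri, body);
        return body;
    }
}
```

With this in place, crawling 3 URLs on one host would cost a single robots.txt
fetch instead of 6; a real implementation would also want expiry and a size
bound on the cache.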

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.