[ https://issues.apache.org/jira/browse/DROIDS-105?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Bertil Chapuis updated DROIDS-105:
----------------------------------
    Fix Version/s: 0.0.2

> missing caching for robots.txt
> ------------------------------
>
>                 Key: DROIDS-105
>                 URL: https://issues.apache.org/jira/browse/DROIDS-105
>             Project: Droids
>          Issue Type: Improvement
>          Components: core
>            Reporter: Paul Rogalinski
>             Fix For: 0.0.2
>
>         Attachments: Caching-Support-and-Robots_txt-fix.patch,
>                      CachingContentLoader.java
>
>
> The current implementation of the HttpClient does not cache requests for
> the robots.txt file. When the CrawlingWorker is used, this results in two
> requests for robots.txt (HEAD + GET) per crawled URL, so crawling three
> URLs sends six robots.txt requests to the target server.
> Unfortunately, the contentLoader is declared final in HttpProtocol, so
> there is no way to replace it with a caching loader such as the one in
> the attachment.

--
This message is automatically generated by JIRA.
-
For more information on JIRA, see: http://www.atlassian.com/software/jira
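The idea behind the attached CachingContentLoader can be sketched as follows: wrap the existing loader and memoize responses per URI, so repeated robots.txt fetches during one crawl hit the network only once. This is a minimal illustration, not the attached patch; the `Loader` interface and `load` signature here are hypothetical stand-ins for the actual Droids ContentLoader API.

```java
import java.net.URI;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical stand-in for Droids' content loader: fetches bytes for a URI.
// The real interface in droids-core differs; this only shows the caching idea.
interface Loader {
    byte[] load(URI uri);
}

// Delegates to another Loader and caches each result by URI, so the
// second and later requests for the same robots.txt are served from memory.
class CachingLoader implements Loader {
    private final Loader delegate;
    private final Map<URI, byte[]> cache = new ConcurrentHashMap<>();

    CachingLoader(Loader delegate) {
        this.delegate = delegate;
    }

    @Override
    public byte[] load(URI uri) {
        // computeIfAbsent calls the delegate only on a cache miss.
        return cache.computeIfAbsent(uri, delegate::load);
    }
}
```

With such a wrapper, crawling three URLs would trigger a single fetch of robots.txt instead of six; the remaining obstacle described above is that HttpProtocol's final contentLoader field prevents plugging the wrapper in.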