On 18.03.2013 18:31, Simon Legner wrote:
> On Mon, 2013-03-18 at 17:46 +0100, Dirk Stöcker wrote:
> Thanks for your work.
>
>> Actually, the number of connects by JOSM itself can be ignored
>> compared to the web spiders, SPAM bots, hacker attempts and all the
>> other things accessing a webpage nowadays. And the wiki, ticket and
>> svn stuff is pretty dynamic.
>>
>> If only real users accessed the webpage, we would have less trouble.
>> Blocking stuff for spiders, on the other hand, is also no good idea,
>> as web search engines like Google are the main entry point to the
>> pages (e.g. we have pretty good click-through rates with Google). To
>> have good performance for the users, there must be a lot of
>> performance left over to serve all the others.

Some services, like search, do not need to be crawled.

> What about re-generating this file on modification and shipping it as
> fast as possible (using Apache directly, nginx or Varnish)? Maybe
> having a subdomain for static content is worth a try (getting rid of
> cookies, sessions etc.).

I thought Trac already does some caching itself.

A good setup would be nginx + Varnish and then Trac: nginx serves HTTP
and HTTPS and proxies internally to Varnish over plain HTTP (maybe via
Unix sockets). It is definitely a bit more complicated than using
Apache, and only theory on my part.

Just my two cents

Colliar

_______________________________________________
josm-dev mailing list
josm-dev@openstreetmap.org
http://lists.openstreetmap.org/listinfo/josm-dev
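For what it's worth, the nginx → Varnish → Trac chain could look
roughly like this on the nginx side (a minimal sketch only; the
hostname, certificate paths and port are made-up placeholders, not the
actual josm.openstreetmap.de setup — Varnish's 6081 is just its common
default listen port):

```nginx
# nginx terminates HTTP/HTTPS for the outside world and hands plain
# HTTP to Varnish, which caches and forwards cache misses to Trac.
server {
    listen 80;
    listen 443 ssl;
    server_name josm.example.org;            # placeholder hostname

    ssl_certificate     /etc/ssl/josm.crt;   # placeholder paths
    ssl_certificate_key /etc/ssl/josm.key;

    location / {
        proxy_pass http://127.0.0.1:6081;    # Varnish in front of Trac
        proxy_set_header Host $host;
        # Let the backend know whether the client used HTTPS:
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

Replacing `proxy_pass http://127.0.0.1:6081;` with a
`unix:/path/to/varnish.sock` target would be the Unix-socket variant
mentioned above, avoiding the TCP loopback hop.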