I have the same problem: all pages are returned with a 200 status code. First of all, I'd check:

SiteController.cache_timeout -- ~1 day
Radiant::Config['dev.host'] -- is different from production
Page.find_by_parent_id(nil).cache? -- true
Response headers:

Server: nginx
Date: Wed, 02 Feb 2011 09:47:49 GMT
Content-Type: text/css; charset=utf-8
Etag: "38be3c15a36887587a1525cfaa502e2d"
X-Rack-Cache: fresh
X-Content-Digest: ea5fdfe604039430439328a42d4145db2b392311
X-Runtime: 138
Content-Length: 2180
Cache-Control: max-age=2592000, public
Age: 381

The "Cache" tab in Firebug for this page shows:

Last Modified: Wed Feb 02 2011 12:06:15 GMT+0200 (MSK)
Last Fetched: Wed Feb 02 2011 12:07:14 GMT+0200 (MSK)
Expires: Fri Mar 04 2011 11:47:49 GMT+0200 (MSK)
Data Size: 2180
Fetch Count: 22
Device: memory

I get the same result for all pages (including the CSS and JS served by Radiant) on every load. Radiant version 0.9.2. Any suggestions?

Also, is it expected that the tmp/cache/meta files include user-related information such as HTTP_COOKIE (with Google Analytics cookie data) and HTTP_CACHE_CONTROL "max-age=0"?

On Jan 26, 10:50 pm, Carl Youngblood <c...@youngbloods.org> wrote:
> Anton, I'm not denying that the site could be optimized significantly,
> but those optimizations are orders of magnitude less important than
> fixing this problem I'm having with 30s refreshes. Until I resolve
> that, all these other techniques are superfluous. As they say, work on
> the bottlenecks. Besides, these images are being cached just fine by
> repeat visits. But the caching problem we're having is making the site
> unusable. Your suggestions are right on the money; I have been
> studying these techniques as well, but I'd like to deal with this
> larger problem first. The fact remains that two refreshes in a row
> both get 200 responses, so rack-cache is not working or else something
> in radiant is screwing things up. Thanks for your suggestions.
>
> On Wed, Jan 26, 2011 at 9:21 PM, Anton J Aylward
> <radi...@antonaylward.com> wrote:
> > Carl Youngblood said the following on 01/25/2011 05:01 PM:
>
> >> to respond to Anton's previous comments, firebug shows that nearly all
> >> the delay is spent waiting for the server.
> > Of course it is, but that isn't what I meant.
> >
> > You have a lot of images on your page.
> > You have javascript and CSS.
> > You also have in-line Javascript and in-line CSS, which is a drag on
> > performance!
> >
> > You have images that sequence. That big block runs under javascript.
> > Why not combine those images into one and use a CSS sprite? Yes, a
> > bigger download, but only ONE download. And there are various ways you
> > can optimise the image so that its size is actually less than the sum
> > of the sizes of the individual images.
> >
> > Your page layout is such that your browser cannot render ANYTHING until
> > it has all arrived from the server. If it has to scale images or wait
> > for (all) the images to arrive to work out the layout, then things are
> > even worse!
> >
> > Of your "30 seconds", 27 of them are the latency of downloading that
> > HTML and those embedded CSS and images. You have over 60 images on
> > that page.
> >
> > If they are fetched sequentially, waiting for one to complete before
> > fetching the next, you have a worst-case situation. You should set up
> > your browser to have multiple persistent parallel connections and fetch
> > them in parallel. IE8 defaults to only 2 parallel connections, which
> > for the type of page you have is crippling.
> >
> > Explanation at
> > http://en.wikipedia.org/wiki/HTTP_persistent_connection
> >
> > There is also a difference in how HTTP/1.0 and HTTP/1.1 handle this.
> > I see you are using 1.0.
> > See
> > https://addons.mozilla.org/en-US/firefox/addon/extra-fasterfoxx/
> > which tweaks these settings:
> >
> > user_pref("network.http.max-connections", 40);
> > user_pref("network.http.max-connections-per-server", 16);
> > user_pref("network.http.max-persistent-connections-per-proxy", 12);
> > user_pref("network.http.max-persistent-connections-per-server", 6);
> > user_pref("nglayout.initialpaint.delay", 0);
> >
> > However, this only makes sense if your Apache is configured with HTTP
> > keep-alive:
> > http://httpd.apache.org/docs/1.3/keepalive.html
> >
> > See also
> > http://www.technewsworld.com/story/it-management/64924.html?wlc=12959...
> >
> > These pages on page design and layout address my other points:
> > http://developer.yahoo.com/performance/rules.html
> > As it says:
> > <quote>
> > Reducing the number of HTTP requests in your page is the place to start.
> > This is the most important guideline for improving performance for first
> > time visitors.
> > </quote>
> > You have a killer number of requests on your front page!
> >
> > Layout of the elements of the page can also be critical.
> > Take a look at the Cuzillion design tool to optimize your layout:
> > http://stevesouders.com/cuzillion/
> >
> > Looking at the source of your pages, I see you are very heavy on DOM
> > elements. You keep using constructs like
> >
> > <div .... content-...-outer ..>
> > <div .... content-...-inner ..>
> >
> > These could be merged in the CSS, making the CSS smaller (and why not
> > compress it?), with fewer DOM elements for the browser to deal with.
> >
> > The number of DOM elements is easy to test; just type in Firebug's
> > console:
> >
> > document.getElementsByTagName('*').length
> >
> > --
> > Wherever you see a successful business, someone once made a courageous
> > decision.
> >     --Peter F. Drucker
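[Editor's note] The headers pasted at the top of this post actually suggest the cache is working at the rack-cache layer: `X-Rack-Cache: fresh` means the entry was served from cache without hitting Radiant, and a cache hit is still returned with status 200 (a 304 only appears for conditional browser requests). The freshness arithmetic rack-cache applies can be sketched in plain Ruby, with the header values hard-coded from the response above (this is a simplified model, not rack-cache's actual code):

```ruby
# Simplified freshness check in the spirit of rack-cache:
# an entry is fresh while (max-age - Age) > 0.
# Header values are copied from the response pasted in the post.

headers = {
  "Cache-Control" => "max-age=2592000, public",
  "Age"           => "381",
}

max_age = headers["Cache-Control"][/max-age=(\d+)/, 1].to_i
age     = headers["Age"].to_i
ttl     = max_age - age

puts "max-age: #{max_age}s, current age: #{age}s"
puts ttl > 0 ? "fresh for another #{ttl}s" : "stale"
# => fresh for another 2591619s
```

So with `Age: 381` against a 30-day `max-age`, this particular response still had roughly the full month of freshness left.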
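[Editor's note] Anton's worst-case arithmetic about the 60+ images can be made concrete with a back-of-the-envelope model. Assuming a hypothetical fixed round trip of 0.4 s per request (an illustrative number, not a measurement from the site in question), total wait time is roughly ceil(requests / connections) x round trip:

```ruby
# Illustrative model of sequential vs. parallel fetching.
# requests and connection counts follow the figures in the thread
# (60+ images, IE8's default of 2 connections); the 0.4 s round trip
# is an assumed number for the sake of the arithmetic.

requests   = 60
round_trip = 0.4  # seconds per request (hypothetical)

[1, 2, 6, 16].each do |conns|
  total = (requests.to_f / conns).ceil * round_trip
  puts "#{conns} connection(s): ~#{total.round(1)} s"
end
# 1 connection:  60 * 0.4 = 24 s  (in the neighbourhood of the ~27 s
#                                  of latency Anton describes)
# 6 connections: 10 * 0.4 =  4 s
```

The point of the model is the ratio, not the absolute numbers: raising the per-server connection limit divides the serialized latency by roughly the number of parallel connections.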
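[Editor's note] Anton's Firebug one-liner (`document.getElementsByTagName('*').length`) has a rough server-side analogue: counting opening tags in a saved copy of the page. The sketch below uses only a regex, so it is deliberately crude (comments, CDATA, and malformed markup are not handled; a real HTML parser such as Nokogiri would be more accurate), and the sample markup is a made-up miniature of the outer/inner div pattern Anton criticizes:

```ruby
# Crude approximation of document.getElementsByTagName('*').length:
# count opening tags in saved HTML. Void elements like <img> count
# as one element each; closing tags (</div>) are not matched.

def dom_element_count(html)
  html.scan(/<[a-z][a-z0-9]*[\s>\/]/i).size
end

sample = <<-HTML
  <div class="content-header-outer">
    <div class="content-header-inner">
      <img src="logo.png"><p>Hello</p>
    </div>
  </div>
HTML

puts dom_element_count(sample)  # => 4
```

Running this over a saved front page gives a quick way to watch the element count drop as redundant outer/inner wrappers are merged into single styled divs.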