On Apr 20, 11:06 pm, Vyacheslav Akhmechet <[email protected]> wrote:
> On Mon, Apr 20, 2009 at 4:13 AM, Leslie P. Polzer <[email protected]> wrote:
> > You're missing my point. One of our basic axioms is that dynamic pages
> > should come fresh from the server and must not be cached on the client.
>
> I didn't realize this is a basic axiom. It just isn't how the web works.
Please elaborate; to my knowledge caching is a very sensitive thing where there's no blanket answer -- rather, you have to decide for every part of your web application how long it should be cached (if it is to be cached at all; I've appended a small sketch of what I mean at the end of this message). This is true even for the most basic CGI script working with data: if you invoke a stale page that offers the possibility to delete object X, which no longer exists, then you've got the same problem we have here.

> That's true, but in case of back buttons users expect some kind of
> support, they're happy with something that doesn't work all the time
> but kind of works some of the time. By garbage collecting the actions,
> you're breaking that expectation, and I think it's pretty important.
> It might be worth doing if you get something significant, but I don't
> see any significant benefits.

I'm still amenable to scrapping this (or rather having it turned off by default) because the performance gains are low, but I still don't fully understand what it is you're fretting about. AFAICS you'd need to show me how you'd sensibly use caching for the dynamic parts of a Weblocks app (and why that would be a great performance gain, since I guess that most of the traffic is JS, CSS and images).
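To make the "decide per resource" point concrete, here is a rough sketch of what I mean, written against plain Hunchentoot handlers rather than anything Weblocks-specific. The handler names and URIs are made up for illustration; only the header-related calls are actual Hunchentoot API (assuming a 1.x Hunchentoot):

(hunchentoot:define-easy-handler (show-dashboard :uri "/dashboard") ()
  ;; Dynamic, session-dependent page: tell the client not to cache it at all.
  (hunchentoot:no-cache)
  (setf (hunchentoot:content-type*) "text/html")
  "<html><body>fresh, per-request content</body></html>")

(hunchentoot:define-easy-handler (site-css :uri "/static/site.css") ()
  ;; Static asset: let the client (and any proxy) cache it for a day.
  (setf (hunchentoot:header-out :cache-control) "public, max-age=86400")
  (setf (hunchentoot:content-type*) "text/css")
  "body { font-family: sans-serif; }")

That's the kind of per-URL decision I mean: the dynamic widget output should always come back fresh, while the static bulk of the traffic (JS, CSS, images) can be cached aggressively.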
