Re: Can client cache pages effectively?

2009-04-10 Thread Jeremy Thomerson
Still sounds like you're jumping through hoops to force this HTML caching to
fit - possibly opening up security vulnerabilities by exposing a user's role
in the URL - which should only be in the session.  I maintain that you'd be
better off caching the data - that's the expensive part anyway.
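
A rough sketch of that data-caching approach, using the plain Ehcache API (the
cache region name, the CachingCatalogDao class, and the runSearchQuery/persist
hooks are all invented for illustration):

import java.util.List;

import net.sf.ehcache.Cache;
import net.sf.ehcache.CacheManager;
import net.sf.ehcache.Element;

public abstract class CachingCatalogDao<T> {

    // cache region assumed to be declared in ehcache.xml as "catalogItems"
    private final Cache cache = CacheManager.getInstance().getCache("catalogItems");

    @SuppressWarnings("unchecked")
    public List<T> findItems(String searchTerm) {
        Element hit = cache.get(searchTerm);
        if (hit != null) {
            return (List<T>) hit.getObjectValue();       // cached query result
        }
        List<T> items = runSearchQuery(searchTerm);      // the expensive part
        cache.put(new Element(searchTerm, items));
        return items;
    }

    public void save(T item) {
        persist(item);
        cache.removeAll();   // catalog changed - drop every cached search result
    }

    // placeholders for the real JDBC/ORM work
    protected abstract List<T> runSearchQuery(String searchTerm);
    protected abstract void persist(T item);
}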

But it's your app and that's my two cents.  :)

--
Jeremy Thomerson
http://www.wickettraining.com

Re: Can client cache pages effectively?

2009-04-09 Thread Jim Pinkham
A quick follow up in case anyone else was curious about how this is going:

I ended up using the Ehcache page cache filter for a simple page that just
displays 'current' items (a calendar view of events) based on a db query.  No
forms (state) on this page, so it works pretty well.  In my DAO that does
updates, I clear the cache.  Very simple, works great.  (The only catch I ran
into is that my menus change when I have a session and I'm logged in as
super-user, so I have to make sure I don't let that version of the page be
cached - I do that by adding a 'super' page parameter so the URL is different,
and the filter is set to only cache the 'normal' version.)
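
The cache-clearing part of that can be as small as the sketch below - the
cache name is an assumption that has to match whatever region the page-caching
filter is configured with, and EventDao/saveEvent are invented names:

import net.sf.ehcache.CacheManager;

public class EventDao {

    public void saveEvent(Object event) {
        // ... persist the event (placeholder for the real update code) ...

        // then drop the cached copy of the calendar page so the next request
        // re-renders it; the name must match the filter's cache region
        CacheManager.getInstance().getCache("SimplePageCachingFilter").removeAll();
    }
}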

So, that still leaves me with my main catalog page, which is primarily a
similar list of items, but it also has some active content (in particular, a
search form).

So my bright idea (tm)  (i.e. I'd love to hear critiques before I get too
far along with it) is the following:

Make a new page for just the data grid, with page parameters including the
search string and last-modified date (and the super-user login, because I get
some edit links and such with that).  Mount it and ehcache it, and override
setHeaders so it becomes client-cacheable.  Then, my outer catalog page
with the search form on it just uses an IFrame to display the grid data
(easy to keep track of last-modified globally).  The same clear method in the
DAO dumps the cache whenever a change is made.
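
A rough sketch of what that wiring could look like - GridPage, CatalogPage,
the "/grid" mount path and the wicket:id values are invented names, and
searchTerm / catalogLastModified stand for whatever the page already has in
hand:

// in the WebApplication subclass's init() - give the grid page a stable URL:
mountBookmarkablePage("/grid", GridPage.class);

// in the CatalogPage constructor - point the iframe at the mounted grid page,
// carrying the search string and the global last-modified stamp so the URL
// changes whenever the data does:
PageParameters params = new PageParameters();
params.put("q", searchTerm);
params.put("mod", String.valueOf(catalogLastModified));

WebMarkupContainer gridFrame = new WebMarkupContainer("gridFrame"); // <iframe wicket:id="gridFrame">
gridFrame.add(new SimpleAttributeModifier("src", urlFor(GridPage.class, params)));
add(gridFrame);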

Also, I'd want to add a robots no-follow tag to avoid a Google trap on that
page.  Could this actually be a legitimate use of the otherwise dodgy IFRAME?

Sound like a good plan?

Thanks,
-- Jim.

Re: Can client cache pages effectively?

2009-04-09 Thread Eduardo Nunes
Usually an IFrame isn't a good idea - I think because of accessibility and
search engines; at least, that's what most HTML coders say about it.  But I
can't think of another way to do it.  Just my 5c.

Re: Can client cache pages effectively?

2009-03-27 Thread Jim Pinkham
Jeremy,

Thanks for your thoughtful reply - Scenario is exactly right.
I played around with page headers to make the whole page cacheable, but ran
into several problems - I have a search form, and there's an 'admin' login
that enables edit links.  So it's really a stateful page, but I want to
speed up the most common state.

The bulk of the content is from an AjaxFallbackDefaultDataTable with
sortable columns.  I re-sorted a column with the Ajax Debug window open to
measure its data size - about 225,000 chars.  My database search takes
64ms.  Overall client repaint time is about 2 sec with the browser on
localhost.  I haven't found the right hook to measure total Wicket response
time yet, but it appears pretty quick - so that's why I thought it made
sense to focus on client caching.

Before I give up entirely on this idea, I'm wondering if it might make sense
to make the grid a public Resource, which I'm hoping the browser would treat
like an image.  I can afford a separate db query to just get my
max(lastModified), which might let me save the time to generate HTML, which
looks as though it could be my bottleneck.  If this way is too hard, I'll
give up, but it sounds do-able - what do you think?
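
Not the Wicket Resource API, but the same conditional-GET idea sketched in
plain servlet terms (queryMaxLastModified and renderGridHtml are placeholders) -
the point being that a cheap max(lastModified) check can answer with a 304 and
skip generating the HTML entirely:

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class GridHtmlServlet extends HttpServlet {

    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        long lastModified = queryMaxLastModified();        // cheap db query
        long ifModifiedSince = req.getDateHeader("If-Modified-Since");

        // If-Modified-Since only has second resolution, so compare in seconds
        if (ifModifiedSince != -1 && lastModified / 1000 <= ifModifiedSince / 1000) {
            resp.setStatus(HttpServletResponse.SC_NOT_MODIFIED); // skip the HTML generation
            return;
        }
        resp.setDateHeader("Last-Modified", lastModified);
        resp.setContentType("text/html");
        resp.getWriter().write(renderGridHtml());           // the expensive part
    }

    private long queryMaxLastModified() { return System.currentTimeMillis(); } // placeholder
    private String renderGridHtml() { return "<table>...</table>"; }           // placeholder
}
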
Thanks,
-- Jim.


Can client cache pages effectively?

2009-03-26 Thread Jim Pinkham
I've found a few posts about how to mark dynamic pages so they won't be
cached.

I've got a different situation that I think is fairly common - the 'home'
page of my app is effectively a (cheesr-like) catalog of items that changes
infrequently.   Users didn't like paging, so it's about 300 items in a
simple scrollable page.  Once a user views it, they often drill down into an
item, then use the back button (or sometimes the Home link) to re-display
it.

The db query is actually pretty fast; the bottleneck seems to be
fetching the HTML.

My question is: can I use some kind of header caching hint with a version
number, so that once the content is identified as being the same as a
previously fetched page, the user's browser will repaint it from a local
cache?  (I know this is typically done with images, but I was wondering if
it would make sense to also do this with content that is technically
'dynamic' but actually 'fairly static'.)  (I say version number rather than
time to expire so that in case I add/change an item I can increment the
catalog version.)

Thanks,
-- Jim


Re: Can client cache pages effectively?

2009-03-26 Thread Jim Pinkham
Changing my search query to this got some better hits:
http://lmgtfy.com/?q=cacheability
So, allow me to refine my question based on that - has anyone tried some of
these approaches (see the first result from above) to generate and dump content
to a static file (renamed if it changes), and have the Wicket home page be a
redirect to that file, or something like that?
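
One way the 'redirect' half of that could look - the static file name, the
catalog-version lookup, and the HomePage class are invented for the sketch, and
actually writing the rendered catalog HTML out to that file (the harder part)
isn't shown:

import org.apache.wicket.markup.html.WebPage;
import org.apache.wicket.markup.html.pages.RedirectPage;

// home page that just bounces the browser to the current pre-generated
// catalog file; the file name carries the catalog version, so a changed
// catalog gets a new URL and old copies can be cached indefinitely
public class HomePage extends WebPage {

    public HomePage() {
        long version = lookupCatalogVersion();
        setResponsePage(new RedirectPage("static/catalog-" + version + ".html"));
    }

    private long lookupCatalogVersion() {
        return 0L; // placeholder - a cheap max(lastModified) query in the real thing
    }
}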

Thanks,
-- Jim.

Re: Can client cache pages effectively?

2009-03-26 Thread Jeremy Thomerson
Have you looked at a standard HTTP caching proxy like
http://www.squid-cache.org/ ?


--
Jeremy Thomerson
http://www.wickettraining.com



Re: Can client cache pages effectively?

2009-03-26 Thread Jim Pinkham
Thanks Jerry; I think that applies only to static pages.

My next idea is to try overriding WebPage.setHeaders and just set the following:

response.setHeader("Cache-Control", "max-age=3600, must-revalidate");

response.setHeader("ETag", "1");  // I'll use a checksum on the data coming
back from my search (even better would be a checksum on the rendered page
data - any idea how to do that?)
Initial test (above) seems promising...
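
Roughly what that override could look like with the checksum filled in -
searchResultData stands for whatever string the search produced, and this
digests the search data rather than the rendered page (hashing the rendered
HTML would need some kind of response filter, which isn't attempted here):

// in the catalog WebPage subclass:
@Override
protected void setHeaders(WebResponse response) {
    response.setHeader("Cache-Control", "max-age=3600, must-revalidate");
    response.setHeader("ETag", "\"" + checksumOf(searchResultData) + "\""); // ETags are conventionally quoted
}

private String checksumOf(String data) {
    try {
        byte[] digest = java.security.MessageDigest.getInstance("MD5")
                .digest(data.getBytes("UTF-8"));
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    } catch (Exception e) {
        return "0";  // fall back to a constant tag rather than fail the render
    }
}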

Thanks,
-- Jim.


Re: Can client cache pages effectively?

2009-03-26 Thread Jeremy Thomerson
How is this going to help you?  Scenario as I understand it:


   1. User requests homepage - pulls from site - with your etag in it
   2. User requests homepage again - calls site - your server does all of
   the loading of data - then you calculate / set etag
   3. Browser now knows that it is the same as before and does not have to
   pull the HTML down

The user saves what is likely a very short time in the overall scheme of
things - downloading the HTML.
The user still has to sit through the process of you loading the data from
the search / DB / etc. and generating the HTML.
Your server saves no load - but a little bandwidth.

I'd look at caching before it even gets to your server.  Otherwise your user
will likely not see much benefit unless you are sending multiple MB of data
back.  Sounds like premature optimization to me.

--
Jeremy Thomerson
http://www.wickettraining.com


