I'm moving into the XML space and one of the things I see is that XML
processing is very expensive, so AxKit, PageKit, et al. make extensive
use of caching.  I'm keeping all of my data in a MySQL DB with about
40 tables.  I'm pretty clear about how to turn that MySQL data into
XML and turn the XML into HTML, WML, or what have you.  But I haven't
been able to wrap my skull around knowing when the data in MySQL is
fresher than what is in the cache without doing a major portion of the
work needed to generate that web page to begin with.
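
(To make that concrete, the kind of cheap freshness test I keep circling
around is something like the sketch below -- the table name, the
last_modified column, and the cache path are all invented, and it's just
plain DBI:)

  use strict;
  use DBI;

  # Hypothetical freshness check: is the cached page still newer than the
  # newest row feeding it?  "articles" and "last_modified" are made-up
  # names -- any cheap "when did this data last change" query would do.
  sub cache_is_fresh {
      my ($dbh, $cache_file) = @_;

      return 0 unless -e $cache_file;
      my $cache_mtime = (stat _)[9];    # reuse the stat from the -e test

      # One small query instead of rebuilding the whole page.
      my ($db_mtime) = $dbh->selectrow_array(
          'SELECT UNIX_TIMESTAMP(MAX(last_modified)) FROM articles'
      );

      return defined $db_mtime && $cache_mtime >= $db_mtime;
  }

  my $dbh = DBI->connect('dbi:mysql:mydb', 'user', 'pass',
                         { RaiseError => 1 });
  print cache_is_fresh($dbh, '/var/cache/html/index.html')
      ? "serve cached copy\n"
      : "regenerate\n";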

Do AxKit and PageKit pay such close attention to caching because XML
processing is so deadly slow that one doesn't have a hope of reasonable
response times on a fast but lightly loaded server otherwise?  Or is
it because even a fast server would quickly be on its knees under
anything more than a light load?

With an MVC-type architecture, would it make sense to have the Model
objects maintain the XML related to the content I want to serve as
static files so that a simple stat of the appropriate XML file tells
me if my cached HTML document is out of date?
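
(Again, just to be concrete, the stat test I'm picturing is nothing
fancier than this -- the file paths are hypothetical:)

  use strict;

  # If the Model keeps page42.xml current, the cached HTML is stale
  # exactly when the XML file is newer.  Paths are made up.
  sub html_is_stale {
      my ($xml_file, $html_file) = @_;

      my $xml_mtime = (stat $xml_file)[9]
          or die "can't stat $xml_file: $!";
      my $html_mtime = (stat $html_file)[9]
          or return 1;    # no cached copy yet, so regenerate

      return $xml_mtime > $html_mtime;
  }

  if (html_is_stale('/data/xml/page42.xml', '/cache/html/page42.html')) {
      # re-run the XSLT (or whatever the transform is) and rewrite the
      # cached HTML
  }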

One more thing.  Perrin Harkins' eToys case study casually mentions a
means of removing files from the mod_proxy cache directory so that
mod_proxy had to go back to the application servers to get an up-to-date
copy.  I haven't seen anything in the mod_proxy docs that says
this is possible.  Does something like that exist outside of eToys?
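
(My best guess at what such a thing would look like -- assuming Apache
1.3's mod_proxy cache, where, as I understand it, the cache filenames
are hashes of the URL but each file records the original URL in plain
text near the top, so a brute-force scan can find the right ones.  The
CacheRoot and URL below are invented:)

  use strict;
  use File::Find;

  my $cache_root = '/var/cache/apache-proxy';    # CacheRoot, hypothetical
  my $stale_url  = 'http://www.example.com/some/page.html';

  # Walk the cache tree, read the top of each file, and unlink anything
  # cached for the stale URL so mod_proxy has to refetch it.
  find(sub {
      return unless -f $_;
      open my $fh, '<', $_ or return;
      my $head = do { local $/ = \4096; <$fh> };    # first 4K is enough
      close $fh;
      if (defined $head && index($head, $stale_url) >= 0) {
          unlink $_ or warn "couldn't unlink $File::Find::name: $!";
      }
  }, $cache_root);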

I don't know, maybe my Prussian Perfection gene has taken over again
and wants a bigger win than I need to get ...

--
Christopher L. Everett
Chief Technology Officer
The Medical Banner Exchange
Physicians Employment on the Internet
