Tim,

I think page-level caching would be a disaster, because every Lift page
depends on the session (e.g., sitemap/navigation).

I would suggest caching subparts of the page, rather than the whole page.
You can see a pattern for caching in
lift-mapper/src/main/scala/net/liftweb/mapper/ProtoTag.scala. I'd also
suggest caching pre-parsed XML rather than Strings.
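
To illustrate that last point, here's a minimal sketch (not Lift's actual ProtoTag API; the `FragmentCache` name and `fragment` method are hypothetical): cache the parsed `NodeSeq` once, so render code reuses the parsed tree instead of re-parsing a String on every request.

```scala
import scala.xml.{NodeSeq, XML}
import scala.collection.concurrent.TrieMap

// Hypothetical sketch: memoize pre-parsed XML fragments as NodeSeq,
// keyed by name, so we pay the String-parsing cost only once.
object FragmentCache {
  private val cache = TrieMap.empty[String, NodeSeq]

  // `raw` is by-name, so the String is produced and parsed
  // only on a cache miss.
  def fragment(key: String)(raw: => String): NodeSeq =
    cache.getOrElseUpdate(key, XML.loadString(raw))
}
```

Usage would look like `FragmentCache.fragment("footer")(loadFooterMarkup())`: the first call parses, later calls return the cached `NodeSeq` directly.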

I have a number of plans for Lift during the RC, including caching loaded
resources when in Production mode and some significant optimizations of the
render pipeline (so that there will be far fewer actual rewrites of the page
during rendering). This should improve performance.

When there's limited database access (e.g., a user lookup on each page), you
should be able to serve 500+ pages/second on a dual Opteron.

Thanks,

David

On Sun, Dec 7, 2008 at 3:47 PM, Tim Perrett <[EMAIL PROTECTED]> wrote:

>
> I've just come across this:
>
> http://blog.cherouvim.com/caching-pages-using-ehcache/
>
> I think it should be just the job - perhaps I might need to extend it,
> but it should work no problem. However, one thing that's not working is
> that I keep getting an NPE because it can't find ehcache.xml on the
> classpath. When you do mvn jetty:run, where would one put ehcache.xml
> so it can be located? I've tried all sorts and done lots of googling
> but still can't find a decent answer.
>
> Apologies this is a little off-topic - it's late now, and I'm wondering
> if there is something going on here I'm just missing?
>
> Cheers guys
>
> Tim
>
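
On the ehcache.xml question in the quoted message: with the standard Maven directory layout, anything under src/main/resources is copied onto the runtime classpath, so mvn jetty:run should be able to find it there via the classloader. A minimal sketch (the `touch` stands in for the real config file):

```shell
# With Maven's standard layout, files in src/main/resources land on
# the runtime classpath, where ehcache looks for ehcache.xml.
mkdir -p src/main/resources
touch src/main/resources/ehcache.xml   # place your real ehcache.xml here
ls src/main/resources
```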


-- 
Lift, the simply functional web framework http://liftweb.net
Collaborative Task Management http://much4.us
Follow me: http://twitter.com/dpp
Git some: http://github.com/dpp

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Lift" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/liftweb?hl=en
-~----------~----~----~----~------~----~------~--~---
