The code could be recreated from memory; I am sure I can get it freed, but it isn't really all that magical. It needs some cleanup, but I will double-check and post the file if I can.

On 10/4/06, Stuart Clarke <[EMAIL PROTECTED]> wrote:

On Tue, 2006-10-03 at 08:42 -0400, lateef jackson wrote:
> +1
> A project I worked on implemented some caching abstraction. There
> were basically 2 options: Cache and TimedCache. Cache was an open
> ended cache where the caching system would evict objects as it needed
> the space. TimedCache was a cache whose entries would expire after a
> specific amount of time. Generally we used the TimedCache for web
> content that rarely changed. Note: objects that went into the open
> ended cache would also need code that invalidated those objects.
>
> We had 2 providers of back-end caching: memcached
> (http://www.danga.com/memcached/) and lrucache
> (http://freshmeat.net/projects/lrucache/). Both sat behind a
> simplified dict interface supporting only 'in', setitem, getitem,
> and delitem.
>
> memcached is really a wonderful piece of software but only if you need
> it. You can probably get away with a simpler caching system if you are
> only running 1 TG instance and don't need to share the cache with
> anything outside of TG.
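For what it's worth, here is a minimal sketch of what a TimedCache with that simplified dict interface might look like. This is not the original project's code (which isn't posted yet); the class name, the `max_age` parameter, and the expire-on-read policy are all my assumptions:

```python
import time


class TimedCache:
    """Hypothetical sketch: a dict-like cache whose entries expire
    max_age seconds after insertion. Supports only 'in', setitem,
    getitem, and delitem, matching the interface described above."""

    def __init__(self, max_age=300):
        self.max_age = max_age
        self._store = {}  # key -> (value, time inserted)

    def __setitem__(self, key, value):
        self._store[key] = (value, time.time())

    def __getitem__(self, key):
        value, stamp = self._store[key]
        if time.time() - stamp > self.max_age:
            # Entry is stale: drop it and behave as if it never existed.
            del self._store[key]
            raise KeyError(key)
        return value

    def __delitem__(self, key):
        del self._store[key]

    def __contains__(self, key):
        try:
            self[key]
            return True
        except KeyError:
            return False
```

Usage would be plain dict syntax: `cache['page'] = html; if 'page' in cache: return cache['page']`. Expiring lazily on read keeps the class tiny, at the cost of stale entries lingering in memory until next touched.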

*** AFAIK, memcached's interface isn't very complex at all.  It's a hash
of arbitrary data, with strings (or maybe any arbitrary data?) as the
keys.  For a single TG instance, this could be implemented as a simple
Python dictionary in the app server's RAM, probably global to all TG
threads, and therefore mutex-protected.
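Sketching that suggestion concretely (names are mine, not any TG API): a module-level dict wrapped in a threading.Lock, shared by all request threads in a single app-server process:

```python
import threading


class LockedCache:
    """Hypothetical sketch: a process-global dict cache for a single
    TG instance, with a Lock serializing access from all threads."""

    def __init__(self):
        self._lock = threading.Lock()
        self._store = {}

    def __setitem__(self, key, value):
        with self._lock:
            self._store[key] = value

    def __getitem__(self, key):
        with self._lock:
            return self._store[key]

    def __delitem__(self, key):
        with self._lock:
            del self._store[key]

    def __contains__(self, key):
        with self._lock:
            return key in self._store


# One global instance, visible to every thread in the process.
cache = LockedCache()
```

This obviously doesn't survive restarts or span multiple TG instances, which is exactly where memcached earns its keep.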

Is your code available for contribution, or is it locked up inside an
NDA somewhere?

Stuart




--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups "TurboGears" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at http://groups.google.com/group/turbogears
-~----------~----~----~----~------~----~------~--~---
