+1 on hashing the key.
Just my two cents' worth, but the way our company does it as standard is
something like this:
import hashlib
import pickle

def generate_key(*args, **kwargs):
    # pickle the args and hash the result -> short, memcached-safe key
    return hashlib.sha224(pickle.dumps([args, kwargs])).hexdigest()
Then you can do stuff like:
lol = YourModel.objects.all()[0]
generate_key('YourModel', lol.id, 'other stuff')
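and drop the result straight into the normal cache calls; a quick sketch
(assuming the default configured cache, with a timeout picked at random):

from django.core.cache import cache

key = generate_key('YourModel', lol.id, 'other stuff')
cache.set(key, lol, 600)   # cache the object for 10 minutes
cached = cache.get(key)    # None on a cache miss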
We actually have a replacement caching class that does all of this behind
the scenes using this approach. We've been running it for almost a year
without a single problem so far (although ours still uses md5 rather than
sha224).
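The class itself isn't anything clever; a stripped-down sketch of the idea
(the class and method names here are made up for illustration) would be:

import hashlib
import pickle

from django.core.cache import cache


class HashedCache(object):
    """Thin wrapper around Django's cache that hashes every key."""

    def make_key(self, *parts):
        # same trick as generate_key above, just tucked away
        return hashlib.sha224(pickle.dumps(parts)).hexdigest()

    def get(self, *parts):
        return cache.get(self.make_key(*parts))

    def set(self, value, *parts):
        cache.set(self.make_key(*parts), value)

So calling code never has to think about key length or odd characters, e.g.
HashedCache().set(lol, 'YourModel', lol.id, 'other stuff').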
Cal
On Thu, Nov 10, 2011 at 12:07 PM, Henrik Genssen
<[email protected]>wrote:
> Hi inonc,
>
> >Why am I the only one who seems to be faced with such issues?
> >I thought millions out there use memcached as their backend...?
> >Wouldn't MD5 be a great solution for all of us? The guys with small
> >sites don't have issues with scaling, and the guys with big sites
> >need a stable memcached backend.
> >
> >Thanks for your help
> >ionic
>
> This is a question I stopped asking myself.
> We inherited from the cache module and simply overrode the problematic
> function where the keys are generated (also to support caching POST
> requests).
> I was tired of discussing the need for this (e.g. for XML-RPC requests -
> as 2 + 2 might always be 4 :-)
>
> regards
>
> Henrik
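
For anyone who wants to go the override route Henrik describes, Django's
cache framework also has a KEY_FUNCTION hook that lets you swap in your own
key generation without subclassing the backend. A minimal sketch, where the
dotted path and module name are just placeholders:

# settings.py -- backend/location are whatever you already use
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        'LOCATION': '127.0.0.1:11211',
        'KEY_FUNCTION': 'myproject.cache_keys.hashed_key',
    }
}

# myproject/cache_keys.py
import hashlib

def hashed_key(key, key_prefix, version):
    # Hash the raw key so the final memcached key is fixed-length and
    # free of spaces/control characters, whatever the caller passed in.
    return '%s:%s:%s' % (key_prefix, version,
                         hashlib.sha224(key.encode('utf-8')).hexdigest())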