Hi,

On Monday, September 2, 2013 6:39:09 AM UTC+2, Curtis Maloney wrote:
>
> Whilst it's conceivable some cache backend will have the smarts to 
> multiplex requests on a single connection, I suspect that's more the 
> exception than the rule.
>

Agreed.

> Obviously, the default would be one per thread.
>

This is what the pylibmc backend does already, and the memcached backend 
should do too; the db backend is per-thread as well since it uses Django's 
db connection. We might want to supply a building block so that not everyone 
has to implement their own variant of the threadlocal stuff :) [Locmem and 
filebased shouldn't cause any issues either way and can stay as they are, I 
think.]

> Of course, that could be simplified by just always creating a new instance 
> when more than just a name is provided. [or a "force new" keyword is 
> passed].
>

I wonder how common it is to supply more than just the name to get_cache 
(aside from tests, maybe). I am +0 for making get_cache return "cached" 
instances when only a name is given and construct new ones in the other 
cases (plus a force keyword if someone really wants the old behavior).
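
Roughly what I have in mind, as a sketch only (the real get_cache signature 
takes more arguments, and _create_cache below is just a stand-in for 
whatever does the actual instantiation today):

_cache_instances = {}

def _create_cache(backend, **kwargs):
    # Stand-in for the existing backend construction logic.
    raise NotImplementedError

def get_cache(backend, **kwargs):
    # Anything beyond the plain name (or an explicit force_new) builds a
    # fresh instance, preserving the old behavior for those callers.
    if kwargs.pop('force_new', False) or kwargs:
        return _create_cache(backend, **kwargs)
    # Lookups by name alone reuse a single instance per alias.
    if backend not in _cache_instances:
        _cache_instances[backend] = _create_cache(backend)
    return _cache_instances[backend]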

Cheers,
Florian

