I am using django-storages to store media files in Amazon S3.

Gunicorn spawns three worker processes, each running its own copy of 
S3BotoStorage. When I go into filebrowser and upload a file, the current 
worker process adds the file to its entries list, so that worker knows the 
file exists. But when I go back to the Media Library, the request is handled 
by a different worker process, which says the file doesn't exist.

Is there any way to get around this? Should django-storages share its 
entries list across processes? Can it be run as a singleton? 
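In case it helps, here is a minimal sketch of what I think is happening, with a toy CachedStorage class standing in for S3BotoStorage's preloaded entries cache (the class and names are hypothetical, just to model the behaviour):

```python
import multiprocessing


class CachedStorage:
    """Toy model of a storage backend that caches known files in-process,
    analogous to S3BotoStorage's preloaded entries list."""

    def __init__(self):
        self.entries = set()  # per-process cache, never shared

    def save(self, name):
        self.entries.add(name)  # the saving worker records the new file

    def exists(self, name):
        return name in self.entries  # answered from the local cache only


def other_worker_exists(name, queue):
    # A different worker process constructs its own storage instance,
    # whose cache has never seen the uploaded file.
    storage = CachedStorage()
    queue.put(storage.exists(name))


if __name__ == "__main__":
    storage = CachedStorage()
    storage.save("photo.jpg")
    print(storage.exists("photo.jpg"))  # the uploading worker sees the file

    queue = multiprocessing.Queue()
    proc = multiprocessing.Process(
        target=other_worker_exists, args=("photo.jpg", queue)
    )
    proc.start()
    proc.join()
    print(queue.get())  # the other worker does not
```

If I'm reading the backend right, exists() only consults the entries cache when metadata preloading is turned on (AWS_PRELOAD_METADATA), so one workaround might be to disable that and let exists() query S3 directly, at the cost of extra requests per check — but I may be wrong about that.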

-- 
You received this message because you are subscribed to the Google Groups 
"Django users" group.