Hi all,

I am currently using an implementation of the django.core.files.storage API
to read/write files on Amazon S3. This has worked well, except in cases
where one wants to read from one stream and write to the storage API at the
same time.

One case I am running into: I would like to create an upload handler that
skips writing files to tmp/ and uploads them directly to S3, since the
files can be quite large. I could write to S3 directly, but I would rather
go through the storage API so I can swap in the local filesystem backend
when developing and running tests.

The storage API currently handles writing itself: save() reads from a
file-like object that you pass in. But you can't do the opposite, where you
are handed chunks of data as they arrive and want to pass them along to the
backend.
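The best workaround I can think of so far is to invert the direction myself:
a file-like object fed from a queue, with storage.save() pulling from it on
a background thread. Rough sketch below; ChunkPipe and save_in_background
are my own helpers, and this assumes the backend is happy to just read()
from the object without seeking:

    import queue
    import threading

    from django.core.files.base import File
    from django.core.files.storage import default_storage


    class ChunkPipe:
        """Minimal file-like object fed by write() calls on one
        thread and drained by read() calls on another."""

        def __init__(self, max_chunks=32):
            # Bounded queue so a slow backend applies backpressure
            # instead of buffering the whole upload in memory.
            self._chunks = queue.Queue(maxsize=max_chunks)
            self._buffer = b""
            self._eof = False

        def write(self, data):
            self._chunks.put(data)

        def close(self):
            self._chunks.put(None)  # sentinel: no more data coming

        def read(self, size=-1):
            # Block until `size` bytes are available or the writer
            # has closed the pipe.
            while not self._eof and (size < 0 or len(self._buffer) < size):
                chunk = self._chunks.get()
                if chunk is None:
                    self._eof = True
                else:
                    self._buffer += chunk
            if size < 0:
                data, self._buffer = self._buffer, b""
            else:
                data, self._buffer = self._buffer[:size], self._buffer[size:]
            return data


    def save_in_background(name, pipe):
        """Run storage.save() on its own thread so it can pull from
        the pipe while the upload handler pushes chunks into it."""
        thread = threading.Thread(
            target=default_storage.save, args=(name, File(pipe))
        )
        thread.start()
        return thread

In the handler, new_file() would create the pipe and call
save_in_background(self.file_name, pipe), receive_data_chunk() would do
pipe.write(raw_data), and file_complete() would pipe.close() and join the
thread. It feels heavier than it should be, though, and it assumes the
backend never needs to seek or know the content length up front (a real S3
PUT generally wants a Content-Length, so this might need multipart uploads
underneath).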

Is there a particularly good way to solve this kind of problem?

Ian
