In my project I need to generate CSV output files for download.
Most of these files aren't overly large (1-20 MB or so), but they can
get quite big (currently the largest are several hundred MB, and this
could grow with time).

This would appear to be a case where there's no magic bullet, but I was
looking to see what folks thought the best way to handle it was.  When I
first put this together a while back I knew very little of either Python
or Django and came up with what now seems a naive solution: I pulled the
appropriate data and constructed a list of lists in memory (with for
loops and list comprehensions, no generators involved), built an
HttpResponse object with the appropriate headers, attached the output of
csv.writer() to it, and returned this mess.
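For illustration, the naive version looked roughly like this (column
names and data shape are hypothetical, and io.StringIO stands in for the
HttpResponse body):

```python
import csv
import io

def naive_csv(rows):
    """Build the entire CSV in memory: every row is materialized in a
    list of lists before a single byte of output is produced."""
    # In the real view this would be an HttpResponse with a text/csv
    # Content-Type and a Content-Disposition: attachment header.
    buf = io.StringIO()
    data = [list(row) for row in rows]       # full copy held in memory
    writer = csv.writer(buf)
    writer.writerow(["id", "name"])          # hypothetical columns
    writer.writerows(data)
    return buf.getvalue()
```

With hundreds of thousands of rows, both `data` and the finished string
live in memory at the same time, which is where the memory hog comes from.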

As you might imagine this was both slow and a complete memory hog, and I'm
looking for better ways to do it.  If nothing else, it seems like making
use of generators should trade a little speed for a heck of a lot of
memory savings, and would probably be an overall better solution than
holding a list of hundreds of thousands of lists in memory.
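A generator-based sketch of that idea, in pure stdlib Python (the
pseudo-buffer trick is a common pattern for streaming CSV; in a view you
would hand the generator to the response as its content rather than
joining it into one string):

```python
import csv

class Echo:
    """Pseudo-buffer: csv.writer calls write() on us, and we hand the
    formatted line straight back instead of storing it anywhere."""
    def write(self, value):
        return value

def stream_csv(rows):
    """Yield one CSV-formatted line at a time; only the current row is
    ever held in memory."""
    writer = csv.writer(Echo())
    yield writer.writerow(["id", "name"])    # hypothetical columns
    for row in rows:                         # e.g. rows from a queryset
        yield writer.writerow(row)

# In a Django view (assumption: the response accepts an iterator as
# content), something along the lines of:
#   response = HttpResponse(stream_csv(qs.iterator()),
#                           content_type="text/csv")
```

The peak memory use is now one row plus csv.writer's formatting buffer,
regardless of how many rows the query returns.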

One thought was to write the file out to a staging area, using generators
to keep memory down, and then return a redirect to that file.
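A sketch of that staging-area idea (paths, column names, and the URL
layout are all hypothetical): stream the rows into a file on disk one at
a time, then redirect the client to wherever the web server serves that
directory from.

```python
import csv
import os
import tempfile

def write_csv_to_staging(rows, staging_dir):
    """Stream rows into a file on disk one at a time, then return the
    path so the view can redirect the client to it."""
    fd, path = tempfile.mkstemp(suffix=".csv", dir=staging_dir)
    with os.fdopen(fd, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "name"])      # hypothetical columns
        for row in rows:
            writer.writerow(row)
    return path

# In the view (hypothetical URL mapping for the staging dir):
#   path = write_csv_to_staging(qs.iterator(), STAGING_DIR)
#   return HttpResponseRedirect("/staging/" + os.path.basename(path))
```

The trade-off is that you now need to clean up old files in the staging
area, but the web server can handle range requests and resumed downloads
for you.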

Another thought was that perhaps there's a better way to do it w/o
involving the middle man of an actual file on disk, but if so that's
beyond my current level of knowledge. 

Any ideas?

Thanks
-J


--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Django users" group.
To post to this group, send email to django-users@googlegroups.com
To unsubscribe from this group, send email to 
django-users+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/django-users?hl=en
-~----------~----~----~----~------~----~------~--~---
