As Django ships, isn't it possible for a user to take down the server by
uploading huge files? Because uploaded files are held entirely in memory,
this seems like it could be a very bad thing.

There's a ticket, #2070, with a patch that buffers files in small chunks,
so that no more than about 64k is ever in memory at once. But, if I'm
reading the code correctly, it still accepts the whole file before the
programmer has a chance to find out how big it is and decide whether to
accept it. Get a bunch of people all uploading 1 GB of nonsense and you
could very quickly find your hard drive full of files you never wanted in
the first place.
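
To make the concern concrete, here's a rough sketch of my own (not the
#2070 patch itself) of chunked spooling with an early size check; the
10 MB cap and the function name are made up for illustration:

    CHUNK_SIZE = 64 * 1024               # match the ~64k buffering in the patch
    MAX_UPLOAD_SIZE = 10 * 1024 * 1024   # arbitrary 10 MB cap, for illustration

    def spool_upload(source, destination, max_size=MAX_UPLOAD_SIZE):
        """Copy at most max_size bytes from a file-like 'source' into an
        open temp file 'destination', raising as soon as the cap is hit
        so the rest of the request body is never written to disk."""
        total = 0
        while True:
            chunk = source.read(CHUNK_SIZE)
            if not chunk:
                break
            total += len(chunk)
            if total > max_size:
                raise ValueError("upload larger than %d bytes" % max_size)
            destination.write(chunk)
        return total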

Would it be better to expose the file-like object that comes with a file
upload, rather than reading the file's whole content into memory (or onto
the server's filesystem, if the patch gets checked in)? It would be easy
to retain backward compatibility by having a call to
FILES['file_upload']['content'] simply call
FILES['file_upload']['file_like_object'].read(). But a developer could
instead decide how large a file they're willing to accept, read that many
bytes, and raise an exception if the file is bigger, rather than waiting
until the whole file has been uploaded.
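
Something like this is the shape of what I have in mind; the
'file_like_object' key and the view below are hypothetical, not an
existing Django API, just to show how both styles of access could coexist:

    class UploadedFile(dict):
        """Backward-compatible FILES entry: old code asking for 'content'
        still works, it just reads the whole stream on demand."""
        def __getitem__(self, key):
            if key == 'content' and not dict.__contains__(self, 'content'):
                return dict.__getitem__(self, 'file_like_object').read()
            return dict.__getitem__(self, key)

    MAX_SIZE = 5 * 1024 * 1024  # whatever limit the developer chooses

    def handle_upload(request):
        upload = request.FILES['file_upload']
        stream = upload['file_like_object']
        data = stream.read(MAX_SIZE + 1)   # read one byte past the limit
        if len(data) > MAX_SIZE:
            raise ValueError("upload too large")
        # ... otherwise carry on with 'data' ...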

I'll admit that what got me thinking about this was the ability to store
binary data in the database rather than on the filesystem, but I'll start
a separate thread for that.

Todd
