#2070: [patch] Large streaming uploads
----------------------------------------+-----------------------------------
   Reporter:  [EMAIL PROTECTED]  |                Owner:  jacob         
     Status:  assigned                  |            Component:  Core framework
    Version:                            |           Resolution:                
   Keywords:  streaming, upload, large  |                Stage:  Accepted      
  Has_patch:  1                         |           Needs_docs:  1             
Needs_tests:  0                         |   Needs_better_patch:  1             
----------------------------------------+-----------------------------------
Comment (by Ivan Sagalaev <[EMAIL PROTECTED]>):

 > The problem is that you do not know in advance how large a file is.
 However you know how large the POST is in total so you could enable file
 streaming based on that.
 
 Yeah, that's what I meant... The whole point is to keep Django from
 reading everything into memory, so a setting capping this amount seems
 most logical to me.
 
 I saw you made this setting. Here are a couple of nits:
 
 > file_upload_min_size = getattr(settings, 'FILE_UPLOAD_MIN_SIZE', 100000)
 
 1. There is a convention in Django to put a setting with its default value
 into django/conf/global_settings.py. This keeps all defaults in one place
 **and** ensures that settings are always present, so you don't have to
 guard them with getattr; just use settings.SETTING_NAME.
 
 2. FILE_UPLOAD_MIN_SIZE sounds confusing, as if we didn't allow users to
 upload small files :-). I still like the idea of calling it a buffer size
 somehow (and actually using it as the buffer size when reading chunks from
 the input). What do you (and everyone else) think?
 
 3. 100 KB is more like 100 * 1024. But I'd make it about 512 * 1024 or
 1024 * 1024. That is small enough not to even show up in memory stats, but
 big enough to handle most profile photos uploaded in practice without
 touching disk most of the time.
 
 > Yes, I did that to avoid too many changes in wsgi.py/modpython.py since
 you have to take care not to try to read the input stream again in case of
 an exception.
 
 Well... An exception guarantees exactly this: the program won't continue
 normally in case of errors; return values, on the other hand, are easy to
 forget to check. I see that you have removed the swallowing of the
 exception. Why not MultiPartParserError then? As I understand it, this
 will happen only when the input is really malformed (e.g. not created by
 a browser), so there's very little a user application can do about it,
 and I don't think it should have to. I think we should catch this
 exception in handlers/base.py and unconditionally return '400 Bad
 Request' on it.
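 A minimal sketch of that error path (standalone, with a stand-in
 exception class and handler function; the real handlers/base.py would
 wrap the actual request cycle):

```python
class MultiPartParserError(Exception):
    """Raised when the multipart POST body is malformed."""

def get_response(request, parse_post):
    """Stand-in for the handler in handlers/base.py: run the POST
    parser and turn a parse failure into a plain 400 response."""
    try:
        parse_post(request)
    except MultiPartParserError:
        # The input wasn't produced by a well-behaved client; there is
        # nothing a user application can do, so answer unconditionally.
        return 400, 'Bad Request'
    return 200, 'OK'
```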
 
 > Aren't those methods invoked automatically for you when you do .save()?
 I didn't think they were public, but I don't understand exactly how they
 work, so I take your word for it...
 
 They are called from save(), yes. But they are also intended as a public
 API and documented (it's in the model API docs too; too lazy to get a
 link :-) ).

-- 
Ticket URL: <http://code.djangoproject.com/ticket/2070#comment:101>
Django Code <http://code.djangoproject.com/>
The web framework for perfectionists with deadlines