The uploads can be HUGE. The files being uploaded are the most obvious
examples, but any type of field can be maliciously made arbitrarily large.

To protect the server, I'd like to suggest the following additions:

        . ability to specify the maximum size of each data field in the
          form. The submitted data should be checked and rejected with
          something like "413 Request Entity Too Large" (the message in
          the Apache error log should explain how to raise the limit);
        . the default maximum size should be small -- something like
          16 bytes for fields, 1 KB for files;
        . preferably, the limits should be changeable within the page
          itself (in the first block of code, for example), but an
          Apache directive would do too -- and it would encourage
          standardization of field names within a site (one possible
          spelling is sketched after this list);
        . to handle data of unlimited size _as it arrives_, it should be
          possible to register Tcl callbacks. For example, let's put
          the following into tovar.tcl:

                proc tovar { filename chunk } {
                        global UPLOAD
                        # Accumulate the arriving chunks in a global
                        # array, keyed by the upload's field name.
                        append UPLOAD(data,$filename) $chunk
                }
          and say:
          
                Dtcl_Script ChildInitScript "source tovar.tcl"
                Dtcl_UploadFilesToProc "tovar"
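
To make the second and third points concrete, here is one possible
spelling for the limit directives. The directive names and values are
made up for illustration -- nothing like them exists in mod_dtcl today:

        # hypothetical directives; sizes are in bytes
        Dtcl_MaxFieldSize 16
        Dtcl_MaxFileSize  1024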
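
And here is a minimal sketch of a callback that handles arriving data
without buffering it all in memory: it spools each chunk to disk and
aborts once a cap is exceeded. It assumes the same (filename, chunk)
calling convention as tovar above; the proc name, the cap, and the
spool path are invented for the example:

        # Assumed 1 MB cap -- purely illustrative.
        set MAX_UPLOAD 1048576

        proc todisk { filename chunk } {
                global UPLOAD MAX_UPLOAD
                # Count the bytes this field has delivered so far.
                if { ![info exists UPLOAD(size,$filename)] } {
                        set UPLOAD(size,$filename) 0
                }
                incr UPLOAD(size,$filename) [string length $chunk]
                if { $UPLOAD(size,$filename) > $MAX_UPLOAD } {
                        error "$filename exceeds $MAX_UPLOAD bytes"
                }
                # Open the spool file lazily on the first chunk (a
                # real version would sanitize $filename first).
                if { ![info exists UPLOAD(fd,$filename)] } {
                        set UPLOAD(fd,$filename) \
                            [open /tmp/$filename.spool w]
                        fconfigure $UPLOAD(fd,$filename) \
                            -translation binary
                }
                puts -nonewline $UPLOAD(fd,$filename) $chunk
        }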

Yours,

        -mi

