Here are the headers passed by MSIE 5.5 during one of my upload tests. Note
that Content-Length is set. I expect most browsers do this properly with
multipart/form-data.

Accept: image/gif, image/x-xbitmap, image/jpeg, image/pjpeg,
application/vnd.ms-excel, application/vnd.ms-powerpoint, application/msword,
*/*
Referer: http://192.168.0.2:8000/file-upload-1.adp
Accept-Language: en-us
Content-Type: multipart/form-data;
boundary=---------------------------7d22ec28103c0240
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0)
Host: 192.168.0.2:8000
Content-Length: 4439129
Connection: Keep-Alive
Cache-Control: no-cache
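For reference, a server-side handler can pull the boundary and the body size
out of headers like these before touching the payload at all. A minimal
sketch (plain Python, not AOLserver's form.tcl; the header dict below just
mirrors the request above):

```python
# Hypothetical illustration, not form.tcl: extract the multipart boundary
# and Content-Length from request headers like the ones shown above.
headers = {
    "Content-Type": "multipart/form-data; "
                    "boundary=---------------------------7d22ec28103c0240",
    "Content-Length": "4439129",
}

def multipart_boundary(content_type):
    """Return the boundary parameter of a multipart/form-data
    Content-Type header, or None if it is not multipart."""
    mediatype, _, params = content_type.partition(";")
    if mediatype.strip().lower() != "multipart/form-data":
        return None
    for param in params.split(";"):
        name, _, value = param.strip().partition("=")
        if name.lower() == "boundary":
            return value.strip('"')
    return None

boundary = multipart_boundary(headers["Content-Type"])
length = int(headers.get("Content-Length", 0))
```

With Content-Length known up front, the server can preallocate or stream
exactly that many bytes instead of reading until the connection closes.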

/s.


-+-+-+-+-+-+-+-+-+-+-+-+-
If to err is human, I must be more human than most.

Scott S. Goodwin
u: http://scottg.net
e: [EMAIL PROTECTED]
p: 850.897.6830
aim: scottgnet


----- Original Message -----
From: "Rusty Brooks" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Monday, February 18, 2002 10:48 PM


> > I'd like to know what version of form.tcl you are using.  I tested
> > quickly in an isolated environment (outside of AOLserver), and
> > feeding it about a 1MB file, it parses in about 1.4 secs.  This is
> > with the nsv_ stuff replaced by plain array set calls and the
> > data-to-disk copies avoided.  This was on a dual PIII-550 (no
> > threading in use).
>
> # $Header: /cvsroot/aolserver/aolserver/tcl/form.tcl,v 1.3 2000/08/01 20:36:17 kriston Exp $
>
> It's the one that came with aolserver 3.4, I think.
>
> It doesn't have any nsv_ stuff in it, so maybe it is a different version
> than you have.
>
> The slowdown, on my machine at least, is definitely the parsing of the
> content boundaries.  Even if only that command were moved to C, I think
> there would be a significant improvement.  Handling the Content-Length
> would be good too, since you could bypass searching for the boundary.  I
> don't know whether Netscape/IE send the header when they do a POST, but
> custom Tcl HTTP clients like the ones I use could easily be made to do it.
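To illustrate the boundary-scanning point, here is a rough sketch (in
Python rather than the thread's Tcl, and deliberately simplified: a real
parser must match the delimiter only at the start of a line, per the
multipart grammar) of splitting a body with bulk substring searches
instead of a line-at-a-time loop:

```python
def split_parts(body, boundary):
    """Split a raw multipart body into its parts using bulk substring
    searches (bytes.find) rather than scanning one line at a time.
    Simplified sketch: ignores part headers and start-of-line matching."""
    delim = b"--" + boundary
    parts = []
    start = body.find(delim)
    while start != -1:
        start += len(delim)
        if body[start:start + 2] == b"--":  # closing delimiter: --boundary--
            break
        nxt = body.find(delim, start)
        if nxt == -1:
            break
        # drop the CRLFs surrounding the part content
        parts.append(body[start:nxt].strip(b"\r\n"))
        start = nxt
    return parts
```

A bulk find like this runs a tight C loop over the buffer, whereas a
line-by-line scan pays interpreter overhead once per line, which is the
cost Rusty is describing.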
>
>
> Oh, and I was doing my tests on a fairly wimpy machine: a 500 MHz
> Celeron with tons of RAM.  It serves me fine, but if it takes this long
> to upload files with multipart forms, I'd rather just take another route.
>
> Oh, and another comment: the number of megabytes in the file is unlikely
> to be the deciding factor.  I think this because the parsing procedure
> goes one *line* at a time.  In a binary file a line may be very long, so
> a large binary file might have only 10% as many lines as a similarly
> large ASCII file.  *My* test, now that I think about it, used
> /usr/dict/words, an extreme example: most lines are less than 20
> characters long, giving it around 4 times as many lines as an average
> text file.
>
> Rusty
>
