> I'd like to know what version of form.tcl you are using.  I tested
> quickly in an isolated environment (outside of aolserver), and
> feeding it about a 1MB file, it parses in about 1.4 secs.  This
> is replacing the nsv_ stuff with just array set calls and avoiding
> the data to disk copies.  This was on a dual PIII-550 (no threading
> in use).

# $Header: /cvsroot/aolserver/aolserver/tcl/form.tcl,v 1.3 2000/08/01 20:36:17 kriston Exp $

It's the one that came with aolserver 3.4, I think.

It doesn't have any nsv_ stuff in it, so maybe it is a different version
from the one you have.

The slowdown, on my machine at least, is definitely in the parsing of the
content boundaries.  Even if only that command were moved to C, I think
there would be a significant improvement.  Handling Content-Length would
be good too, since you could read exactly that many bytes instead of
searching for the boundary.  I don't know whether Netscape/IE send the
header when they POST, but custom Tcl HTTP clients like the one I use
could easily be made to do it.
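For a custom Tcl client, adding a per-part Content-Length is just one extra header line when building the multipart body.  A minimal sketch (the proc name and boundary are my own invention, not anything from form.tcl):

```tcl
# Hypothetical helper: build one part of a multipart/form-data body,
# including a per-part Content-Length so a server-side parser could
# read exactly that many bytes rather than scanning for the boundary.
proc buildPart {boundary name data} {
    set hdr "--$boundary\r\n"
    append hdr "Content-Disposition: form-data; name=\"$name\"\r\n"
    append hdr "Content-Length: [string length $data]\r\n"
    append hdr "\r\n"
    return "$hdr$data\r\n"
}

puts [buildPart XYZ file "hello world"]
```

A parser that trusted this header could seek straight past the data to the next boundary line instead of examining every line in between.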


Oh, and I was doing my tests on a quite wimpy machine: a Celeron 500 MHz
with tons of RAM.  It serves me fine, but if it takes this long to upload
files with multipart forms, I'd rather just take another route.

Oh, and another comment: the number of megabytes in the file is unlikely
to be the deciding factor.  The reason I think this is that the parsing
procedure goes one *line* at a time.  In a binary file a line may be very
long, so a large binary file might have only 10% as many lines as a
similarly large ASCII file.  *My* test, now that I think about it, used
/usr/dict/words, an extreme example: most lines are less than 20
characters long, so it has around four times as many lines as an average
text file.
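The line-count argument is easy to check.  A rough sketch (the sizes and line lengths are made up, chosen only to contrast short lines against long ones at the same total byte count):

```tcl
# Two buffers of identical size: one with 11-byte lines (like
# /usr/dict/words) and one with 1100-byte lines (like binary data
# that rarely contains a newline).
set short [string repeat "abcdefghij\n" 100000]
set long  [string repeat "[string repeat x 1099]\n" 1000]

puts "bytes: [string length $short] vs [string length $long]"
puts "lines: [llength [split $short \n]] vs [llength [split $long \n]]"
```

Both buffers are about 1.1 MB, but the short-line one has roughly 100 times as many lines, so a line-at-a-time parser does roughly 100 times as many iterations over it.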

Rusty
