Sherlock, Ric wrote:
It seems to me (someone with no in-depth knowledge of the area) that HTTP
in general must be able to handle files bigger than "a few megabytes",
given that many HTTP downloads of software are 50MB+. Having said that,
50MB should still fit in the memory of most servers, as long as many
people are not uploading at once.

Is there a difference between upload & download in this regard, or is it
just because downloads are 1 computer to many computers and uploads are
potentially many computers to 1?

I'm not familiar with JHP. AFAIK most HTTP servers (Apache, at least) have a configurable upper limit on the size of POST data per request; the default limit can be changed in the conf file (in Apache, the LimitRequestBody directive). I guess this limit exists for security reasons. An HTTP server may also receive data in memory up to, say, 500kB, then flush it to a temp file and continue receiving. After all the data is received, the server sends the content to the CGI program via stdin/stdout. I guess this is the usual scenario: data is buffered before being sent to the CGI program.
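The in-memory-then-spill buffering described above can be sketched in Python, whose standard library happens to provide exactly this behavior via `tempfile.SpooledTemporaryFile` (data stays in memory until it exceeds `max_size`, then is transparently rolled over to a temp file). The 500kB threshold and the `receive_body` helper are illustrative assumptions, not how any particular server is actually implemented:

```python
import io
import tempfile

# Hypothetical 500 kB threshold, matching the figure mentioned above.
MAX_IN_MEMORY = 500 * 1024

def receive_body(stream, length, chunk_size=64 * 1024):
    """Read `length` bytes of request body from `stream` in chunks.

    Small bodies stay in memory; bodies over MAX_IN_MEMORY are
    transparently spooled to a temporary file on disk.
    """
    buf = tempfile.SpooledTemporaryFile(max_size=MAX_IN_MEMORY)
    remaining = length
    while remaining > 0:
        chunk = stream.read(min(chunk_size, remaining))
        if not chunk:
            break  # client closed the connection early
        buf.write(chunk)
        remaining -= len(chunk)
    buf.seek(0)  # rewind so the body can be handed to the CGI program
    return buf

# A 1 kB body: stays entirely in memory.
body = receive_body(io.BytesIO(b"x" * 1024), 1024)
```

A real server would additionally enforce the configured upper limit by rejecting the request once `length` (or the running byte count) exceeds it, rather than buffering indefinitely.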

--
regards,
bill
----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm
