On Apr 20, 2004, at 12:38 PM, jbv wrote:


Don't know what the max is, but last year I made an app that
uploaded elements of a webpage, including JPEGs up to 150 KB.
On the other side (on the server) was a MC CGI app whose task
was to receive the elements, save them in the right directories,
and build the final webpage.

Don't know if that helps, but I'd suggest breaking the content
into smaller chunks, building a custom protocol to identify each
chunk while posting it, and then reassembling the whole data set
on the server. Furthermore, even if huge amounts of data could be
posted (in theory), I'm not sure that any server would hold the
connection open until all the data is received...
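The chunking scheme JB describes could be sketched in Python along these lines; the function names and the 16 KB chunk size are illustrative assumptions, not details from the original app:

```python
# Sketch of JB's suggestion: split a payload into numbered chunks so the
# CGI on the server can identify each piece and reassemble them in order.

def split_into_chunks(data: bytes, chunk_size: int = 16 * 1024):
    """Yield (index, total, chunk) tuples, one per POST request."""
    total = (len(data) + chunk_size - 1) // chunk_size
    for i in range(total):
        yield i, total, data[i * chunk_size:(i + 1) * chunk_size]

def reassemble(received):
    """Server side: sort the received chunks by index and join them."""
    return b"".join(chunk for _, _, chunk in
                    sorted(received, key=lambda t: t[0]))
```

Each chunk would be posted with its index and the total count, so the server knows when the set is complete even if requests arrive out of order.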

JB

JB,


There's no need for a custom protocol. See my previous post: HTTP 1.1 supports chunked transfers as part of the spec, and the server will keep listening while it receives the chunks. For sending files you really should use multipart/form-data (okay, it's not literally called "multipart data", but that's the encoding your browser uses to upload files from a form...)
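The multipart/form-data encoding Andre mentions can be built by hand; a minimal Python sketch, where the field name, filename, and boundary string are illustrative assumptions:

```python
# Build a multipart/form-data request body by hand, the way a browser
# encodes an <input type="file"> upload. Names here are illustrative.

def build_multipart(field_name: str, filename: str, file_bytes: bytes,
                    boundary: str = "----revboundary1234"):
    """Return (content_type_value, body) for an HTTP POST."""
    body = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="{field_name}"; '
        f'filename="{filename}"\r\n'
        "Content-Type: application/octet-stream\r\n"
        "\r\n"
    ).encode() + file_bytes + f"\r\n--{boundary}--\r\n".encode()
    content_type = f"multipart/form-data; boundary={boundary}"
    return content_type, body
```

The resulting body can then be POSTed with the Content-Type header set to the returned value, whether from Python or from a Revolution `post` command.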

Cheers
Andre



--
Andre Alves Garzia  2004  BRAZIL
http://studio.soapdog.org

_______________________________________________
use-revolution mailing list
[EMAIL PROTECTED]
http://lists.runrev.com/mailman/listinfo/use-revolution
