On 07/20/2010 04:30 PM, Roan Kattouw wrote:
> This does need client-side support, e.g. using the Firefogg extension
> for Firefox or a bot framework that knows about chunked uploads.

Requiring special client software is a problem. Is that really
the only possible solution?

I understand that a particular webserver or PHP configuration can
be a problem, in that PHP might buffer the entire file in /tmp
(which can fill up) before returning control to some upload.php
script. But I don't see why HTTP in itself would set a limit at
100 MB. What decides this particular limit? Why isn't it 50 MB
or 200 MB?
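(For what it's worth, the limit usually isn't HTTP at all but the
server configuration. In a stock PHP setup, directives like these
in php.ini decide the effective cap -- the values below are just
illustrative, not MediaWiki's actual settings:

```ini
; php.ini -- illustrative values, not MediaWiki defaults
upload_max_filesize = 100M   ; per-file cap on a single upload
post_max_size = 100M         ; cap on the whole POST body
max_execution_time = 300     ; long uploads can also hit this
memory_limit = 128M          ; script memory, separate from /tmp

Whichever of these is smallest wins, which may be why the number
looks arbitrary from the outside.)
```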

Some alternatives would be to offer a separate anonymous FTP
upload ("requires special client software" -- from the 1980s,
still in use by the Internet Archive) or a get-from-URL feature
(the server would download the file by HTTP GET from the user's
server at a specified URL).
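The get-from-URL idea is only a few lines on the server side. A
hypothetical Python sketch (the function name and the size cap are
my own inventions, not anything MediaWiki has) that streams the
remote file to disk in chunks, so the whole thing never sits in
memory or in /tmp at once:

```python
import shutil
import urllib.request

MAX_BYTES = 100 * 1024 * 1024  # hypothetical server-side cap


def fetch_upload(url, dest_path, max_bytes=MAX_BYTES):
    """Download `url` to `dest_path` in chunks, refusing oversize files."""
    with urllib.request.urlopen(url) as resp, open(dest_path, "wb") as out:
        copied = 0
        while True:
            chunk = resp.read(64 * 1024)  # 64 KiB at a time
            if not chunk:
                break
            copied += len(chunk)
            if copied > max_bytes:
                raise ValueError("remote file exceeds upload limit")
            out.write(chunk)
    return copied
```

The point is that the server controls the transfer, so it can
enforce its own limit mid-stream instead of trusting the client.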


-- 
   Lars Aronsson ([email protected])
   Aronsson Datateknik - http://aronsson.se



_______________________________________________
Wikitech-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
