How is 700MB too big for HTTP? Ever download a Linux distro? Ever
benchmark FTP vs HTTP? The overhead is minimal...
I download Linux distros all the time - er, whenever a new CentOS is released.
It's not overhead that is the issue.
It's being able to continue an interrupted download that is the issue.
Some http clients can. To be fair, HTTP/1.1 actually does standardize
this (the Range request header), but it only works when both client and
server support it, and not every server does.
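For the curious, an HTTP resume boils down to asking for bytes from your current offset onward and appending only if the server answers 206 Partial Content. A minimal sketch (the URL and file paths would be whatever you're actually downloading):

```python
import os
import urllib.request

def resume_headers(partial_path):
    """Build a Range header asking for the bytes we don't have yet."""
    offset = os.path.getsize(partial_path) if os.path.exists(partial_path) else 0
    return ({"Range": "bytes=%d-" % offset}, offset) if offset else ({}, 0)

def resume_download(url, dest):
    headers, offset = resume_headers(dest)
    req = urllib.request.Request(url, headers=headers)
    with urllib.request.urlopen(req) as resp:
        # 206 Partial Content: the server honored the Range request.
        # 200 OK: it ignored it, so start over from byte zero.
        mode = "ab" if resp.status == 206 else "wb"
        with open(dest, mode) as f:
            while chunk := resp.read(64 * 1024):
                f.write(chunk)
```

If the server doesn't understand Range you silently fall back to a full re-download, which is exactly the failure mode being complained about here.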
Also - many people use a temp filesystem (aka RAM disk) for www
downloads (I doubt a majority, but those of us who are smart do), where
the file is staged until all the pieces have come down and only then
written to its final destination. Using tmpfs for that means less disk
I/O: www downloads are typically small, so there's no need to touch the
disk until you have the whole file and can write it in one pass. But
700MB can easily fill that temp filesystem, depending upon the size of
your tmpfs (and what else it is being used for).
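If you do stage downloads in a tmpfs, it's cheap to check whether the file will even fit before starting. A sketch using os.statvfs (the /tmp mount point and the CD-sized 700MB figure are just the example from this thread):

```python
import os

def free_bytes(path):
    """Free space, in bytes, on the filesystem containing `path`."""
    st = os.statvfs(path)
    return st.f_bavail * st.f_frsize

# Stage in /tmp only if a 700MB ISO will actually fit there;
# otherwise fall back to somewhere on real disk.
ISO_SIZE = 700 * 1024 * 1024
staging = "/tmp" if free_bytes("/tmp") > ISO_SIZE else os.path.expanduser("~")
```

os.statvfs is Unix-only, which seems a safe assumption for this audience.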
Serving via ftp - virtually every ftp client out there knows how to
resume an interrupted download, so it is much better suited to large
downloads than http. And when serving via ftp/torrent, the user is far
less likely to be using a tmpfs as a staging area, which can otherwise
result in a bad download when that staging area fills up.
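What makes FTP resume work under the hood is the REST command: the client tells the server the byte offset to restart the transfer from. Python's ftplib exposes this as the `rest` argument to retrbinary; a sketch (the host and remote path would be your mirror's, and this assumes anonymous login):

```python
import os
from ftplib import FTP

def resume_offset(dest):
    """Byte offset to hand to REST: the size of the partial file, or 0."""
    return os.path.getsize(dest) if os.path.exists(dest) else 0

def ftp_resume(host, remote_path, dest):
    """Resume (or start from scratch) an FTP download of remote_path."""
    offset = resume_offset(dest)
    with FTP(host) as ftp:
        ftp.login()  # anonymous; adjust for real credentials
        with open(dest, "ab") as f:
            # rest=offset makes ftplib send "REST <offset>" before RETR,
            # so the server starts sending at that byte, not byte zero.
            ftp.retrbinary("RETR " + remote_path, f.write, rest=offset)
```

Since the data is appended straight onto the partial file on disk, there's no tmpfs staging step to overflow.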
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php