I used to run a file hosting website. My advice would be to use a separate dedicated file server like nginx with X-Accel-Redirect: http://kovyrin.net/2006/11/01/nginx-x-accel-redirect-php-rails/
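A minimal sketch of how that can look, assuming an nginx `internal` location named `/protected/` and a CFML page that has already checked the user's permissions (the location name, directory path, and `fileName` variable are placeholders, not from any real setup):

```
# nginx: internal location, only reachable via X-Accel-Redirect,
# never directly from the client
location /protected/ {
    internal;
    alias /var/storage/files/;   # actual downloads live here
}
```

```
<!--- CFML: after verifying the request, hand the actual
      transfer off to nginx with a response header --->
<cfheader name="Content-Disposition"
          value="attachment; filename=#fileName#">
<cfheader name="X-Accel-Redirect" value="/protected/#fileName#">
<cfcontent type="application/octet-stream">
```

nginx then streams the file itself, so the CFML engine is only occupied for the permission check rather than for the whole multi-megabyte transfer.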
HTTPD should have something like this as well if you'd rather use that. This way you can just set a response header (via cfheader) to serve your files.

On Thu, Jun 17, 2010 at 1:39 PM, Halo Maps <[email protected]> wrote:

> Greetings,
>
> I am hoping to pick some of your brains. I find myself in a quandary. If
> you will indulge me I will explain the situation as briefly as I can.
>
> We currently have a website, several really, that are fairly heavily
> trafficked, about 12-18 million pages per month, primarily used to obtain
> video game maps and assets for the games. The website is not the issue,
> since we store the actual files for download on remote servers and link
> people to them for download via FTP. For the past 3 years this has been
> working fairly well. We average over 15,000 GB/month in file traffic and
> serve an average of 350,000 files/month using this method.
>
> We have recently upgraded our remote servers to Athlon 64 X2 3400+
> (dual-core, 64-bit, 2x 1.8 GHz, 4 GB DDR2 RAM) running Windows 2008 x64
> Web with IIS 7, with substantially more bandwidth availability. Since it
> is a newer OS I was unable to use the old CF5 installation on the remote
> server, and since we only need minimal functionality I installed Tomcat
> and Open BlueDragon for the CFML engine. This seems to work as before,
> since we only use CF on the remote servers to check whether the file
> exists before sending people to the FTP.
>
> In other applications I routinely use cfcontent to serve protected files
> on extranet applications; however, that traffic (10-20 files/day) is
> nowhere near as rigorous as will be required here, with 12,000 files per
> day at 40 MB average per file. I am considering serving the files through
> cfcontent via HTTP instead of FTP for a couple of reasons:
>
> 1) because most Internet security programs block FTP and we have to help
> people (mostly kids) open the port, and
> 2) to prevent direct linking to the files, because this endeavor is
> funded by ads on the website (and my wallet).
>
> I know the most efficient way to serve this quantity and size of files is
> via FTP, but what I don't know is what is required by the various CF
> engines (Adobe CF, OpenBD, BlueDragon, Railo) to serve up the same via
> HTTP. Will our new server hardware handle that kind of HTTP file traffic
> (I suspect so)? Will OpenBD/Tomcat be up to the task, or will I need a
> different CFML engine? Essentially, what I need to know is what it would
> take to routinely serve that many files of that size through the CF
> engine. I don't want to go down this road and find that people are having
> problems downloading because the CF engine / web server can't keep up.
> Has anyone had experience with this, and can you offer some advice?
>
>
> Dennis Powers
> UXB Internet - A Website Design & Hosting Company
> P.O. Box 6028
> Wolcott, CT 06716
> 203-879-2844
> http://www.uxbinternet.com
>
>
> --
> Open BlueDragon Public Mailing List
> http://www.openbluedragon.org/ http://twitter.com/OpenBlueDragon
> online manual: http://www.openbluedragon.org/manual/
>
> mailing list - http://groups.google.com/group/openbd?hl=en
