How is 700MB too big for HTTP?  Ever download a Linux distro?  Ever
benchmark FTP vs HTTP?  The overhead is minimal...

Kyle Smith
Unix Systems Administrator

-----Original Message-----
From: Nitsan Bin-Nun [] 
Sent: Thursday, April 16, 2009 3:37 PM
To: PHP General List
Subject: Re: [PHP] Need Your Help :) I'm Just About Creating File
Uploading Service

My bad, I'm sending a copy to the list.

On Thu, Apr 16, 2009 at 9:37 PM, Nitsan Bin-Nun <>

> Actually I don't much care for that; IMO ~700MB is way too big for
> HTTP. I thought of giving away FTP links for download, but I have
> thought of the following:
> * First, is there a solution which will do this session validation/etc.
> through .htaccess and will only rewrite it to the file, instead of
> sending it in chunks? Because if the server has to send it in chunks
> it will be needless overkill for the CPU (calculating and reading
> these files..... overkill....).
> * Secondly, I thought of sending these ~700MB files through HTTP and
> giving away FTP links to the people who bought this functionality. I
> don't really care whether it works or not, as long as the website's
> reputation is still up.
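For the first idea - letting the web server itself hand over the file so no PHP process sits in a copy loop - mod_xsendfile is one way to do it. A minimal sketch, assuming the Apache module is installed and `XSendFile On` is set; the session flag and path below are made up for illustration:

```php
<?php
// Hypothetical gate script: PHP only checks permission, then the
// X-Sendfile header tells Apache (mod_xsendfile) to stream the file
// itself. The session flag and the path are assumptions.
session_start();

if (empty($_SESSION['may_download'])) {
    header("HTTP/1.1 403 Forbidden");
    exit;
}

$file = "/files/outside/docroot/something.tar"; // kept above the web root

header("Content-Type: application/x-tar");
header('Content-Disposition: attachment; filename="something.tar"');
header("X-Sendfile: " . $file); // Apache takes over the transfer here
exit;
```

The point is that PHP exits immediately after emitting the header, so CPU and memory cost per download stay near zero regardless of file size.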
> I have also just signed a contract with a downloads website which has
> 80k unique visitors/DAY!
> So I really have to think about scalability from the beginning. Do
> you have any ideas/notes/anything that I should take care of or keep
> in mind when thinking of an 80k crowd driving full speed towards my
> server every DAY??
> Thanks in advance,
> Nitsan
> On Thu, Apr 16, 2009 at 9:27 PM, Michael A. Peters
>> Nitsan Bin-Nun wrote:
>>> Hi List,
>>> I have been thinking for a while about setting up my own
>>> rapidshare.com clone; a few days back I got really serious about
>>> this and came up with some ideas.
>>> This is where I need your help: my partner and I have been thinking
>>> about the system that the website should run on.
>>> We came to the conclusion that we are going to write it in PHP.
>>> There are several issues that came up during the brainstorm:
>>> First, how can we keep the files from being published as direct
>>> links? My first idea was to host them one directory up from the
>>> http root. It seems good, but how would I deliver the files to the
>>> users?
>> A PHP wrapper.
>> It validates (session id or whatever) that the client has permission
>> to access the file, and then sends the real file.
>>
>> $archive = "/path/to/some/tarball";
>> $tarname = "something.tar";
>>
>> header("Pragma: public");
>> header("Expires: 0");
>> header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
>> header("Cache-Control: private", false);
>> header('Content-type: application/x-tar');
>> header("Content-Disposition: attachment; filename=" . $tarname);
>> header("Content-Transfer-Encoding: binary");
>>
>> if ($fp = fopen($archive, 'rb')) {
>>     $sendOutput = "";
>>     while ($l = fgets($fp)) {
>>         $sendOutput .= $l;
>>     }
>>     $outputLen = strlen($sendOutput);
>>     header("Content-Length: $outputLen");
>>     print $sendOutput;
>> } else {
>>     // for whatever reason we failed
>>     die();
>> }
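One caveat with the wrapper above: building the whole file into a string pins that much RAM per download, which hurts at ~700MB. A sketch of the same wrapper streaming in fixed-size chunks instead (the path and filename are placeholders):

```php
<?php
// Same idea as the wrapper above, but streamed in 8 KB chunks so a
// 700 MB file never has to fit in memory. Path and filename are
// placeholder assumptions.
$archive = "/path/to/some/tarball";
$tarname = "something.tar";

header('Content-type: application/x-tar');
header("Content-Disposition: attachment; filename=" . $tarname);
header("Content-Transfer-Encoding: binary");
// filesize() gives Content-Length without reading the file first
header("Content-Length: " . filesize($archive));

if ($fp = fopen($archive, 'rb')) {
    while (!feof($fp)) {
        echo fread($fp, 8192); // one chunk at a time
        flush();               // push it to the client as we go
    }
    fclose($fp);
} else {
    // for whatever reason we failed
    die();
}
```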
>>> We are talking about unlimited file-size hosting, so that means
>>> that we will have to stream the files somehow... and they will be
>>> BIG (it's dependent, about ~700MB each)
>> Then I suggest setting up a torrent instead of direct download.
>> You can have protected torrents. I don't know how to set them up but
>> I use them - there's a torrent site that requires I log in from the
>> same IP as I'm running the torrent client from, for example.
>> If you want to provide service for those who can not use a torrent,
>> use an ftp server to serve the files - so that ftp clients capable
>> of continuing an interrupted download can be used.
>> 700MB is really too big for http. Sure, it works, but it is better
>> to use a protocol designed for large binary files.

PHP General Mailing List