[PHP] Downloading Large (100M+) Files

2004-11-04 Thread Robin Getz
Hi. I have searched a few of the mailing lists and have not found an answer. I am working on a site that is currently running gforge ( http://gforge.org/ ). The process used to download files from the file repository is something like: Header('Content-disposition:
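
Filled out, the handler Robin is describing presumably looks something like this (the path, Content-Type, and Content-Length lines are assumptions; only the Content-disposition header and, from later in the thread, readfile() actually appear in the messages):

    <?php
    // Hypothetical file path; gforge's real handler computes this
    // from the file repository.
    $name = '/var/tmp/release-1.0.tar.gz';

    // Tell the browser to save the file instead of rendering it.
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($name) . '"');
    header('Content-Length: ' . filesize($name));

    // readfile() reads the file and writes it to the output stream
    // in a single call.
    readfile($name);
    ?>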

Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Klaus Reimer
Robin Getz wrote: The issue is that readfile writes it to the output buffer before sending it to the client. Are you sure you HAVE output buffering? What does ob_get_level() return? If it returns 0, then you don't have output buffering. My theory (and it's only a theory) is that readfile may
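
A quick way to run Klaus's check; the cleanup loop at the end is an assumption about the fix, not something stated in the thread:

    <?php
    // ob_get_level() returns the nesting depth of output buffers;
    // 0 means no output buffering is active.
    var_dump(ob_get_level());

    // If buffering is on, closing every buffer before streaming a large
    // file keeps PHP from accumulating the whole file in memory.
    while (ob_get_level() > 0) {
        ob_end_clean();
    }
    ?>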

Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Robin Getz
Klaus Reimer [EMAIL PROTECTED] wrote: If this theory is true, you may try fpassthru(). replaced: readfile($name); with: $fp = fopen($name, 'rb'); fpassthru($fp); and now I don't lose 250 Meg of memory every time I download a 250 Meg file. If someone wants to add this to the readfile() php manual - great.
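
Robin's replacement, padded out into a self-contained handler (the header lines and the path are assumptions carried over from the original download script, not part of his message):

    <?php
    $name = '/var/tmp/release-1.0.tar.gz'; // hypothetical path

    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($name) . '"');
    header('Content-Length: ' . filesize($name));

    // fpassthru() writes everything from the current file position
    // to EOF straight to the output stream. It does not close the
    // handle, so fclose() afterwards.
    $fp = fopen($name, 'rb');
    fpassthru($fp);
    fclose($fp);
    ?>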

Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Greg Donald
On Thu, 04 Nov 2004 08:22:18 -0800, Robin Getz [EMAIL PROTECTED] wrote: and now I don't lose 250 Meg of memory every time I download a 250 Meg file. If someone wants to add this to the readfile() php manual - great. Anyone can post user comments in the manual. Give it a shot. -- Greg

Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Curt Zirzow
* Thus wrote Robin Getz: Klaus Reimer [EMAIL PROTECTED] wrote: If this theory is true, you may try fpassthru(). replaced: readfile($name); with: $fp = fopen($name, 'rb'); fpassthru($fp); The only difference between readfile() and fpassthru() is what parameters you pass it.

Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Robin Getz
Curt Zirzow [EMAIL PROTECTED] wrote: replaced: readfile($name); with: $fp = fopen($name, 'rb'); fpassthru($fp); The only difference between readfile() and fpassthru() is what parameters you pass it. Something else is the problem; what version of PHP are you running? I am using php

Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Klaus Reimer
Robin Getz wrote: The same problem exists with fpassthru (now that I have let it run a little longer). I now have 5 sleeping httpd processes on my system that are consuming 200 Meg each. Any thoughts? OK, so much for the theory. What about the output buffering? Have you checked if you have output buffering?
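
One way to confirm whether the memory really stays inside the PHP process is to instrument the call. This sketch is a suggestion, not something from the thread; memory_get_usage() requires PHP 4.3+ built with --enable-memory-limit:

    <?php
    $name = '/var/tmp/release-1.0.tar.gz'; // hypothetical path

    // memory_get_usage() reports how much memory the script
    // has allocated at this point.
    $before = memory_get_usage();
    readfile($name);
    $after = memory_get_usage();

    error_log('readfile() grew memory by ' . ($after - $before) . ' bytes');
    ?>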

Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Robin Getz
OK, I checked things out and, based on some private emails and pointers from Francisco M. Marzoa [EMAIL PROTECTED], I have now replaced: readfile($name); with: while(!feof($fp)) { $buf = fread($fp, 4096); echo $buf; $bytesSent += strlen($buf); /* We know how
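
The preview cuts the loop off, but a self-contained version presumably looks like this (the fopen(), flush(), and header lines are assumptions based on common practice; only the while loop itself is quoted from the message):

    <?php
    $name = '/var/tmp/release-1.0.tar.gz'; // hypothetical path

    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($name) . '"');
    header('Content-Length: ' . filesize($name));

    $fp = fopen($name, 'rb');
    $bytesSent = 0;

    while (!feof($fp)) {
        $buf = fread($fp, 4096);    // read the file in 4 KB chunks
        echo $buf;
        $bytesSent += strlen($buf); // track how many bytes have gone out
        flush();                    // push each chunk to the client right away
    }
    fclose($fp);
    ?>

Reading in small fixed-size chunks keeps peak memory at roughly the chunk size regardless of file size, which is the point of the change.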

Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Curt Zirzow
* Thus wrote Robin Getz: Curt Zirzow [EMAIL PROTECTED] wrote: replaced: readfile($name); with: $fp = fopen($name, 'rb'); fpassthru($fp); The only difference between readfile() and fpassthru() is what parameters you pass it. Something else is the problem; what version of PHP are you running?