PHP Version 4.1.2, Red Hat Linux release 7.3 (Valhalla) (Kernel 2.4.18-3 on an i686), Apache/1.3.23
Rasmus Lerdorf wrote:
> Which OS and which PHP version?
>
> On Fri, 4 Oct 2002, christian haines wrote:
>
> > this is what i have exactly in my code...
> >
> > header("Content-Type: application/force-download; name=\"$file\"");
> > header("Content-Disposition: attachment; filename=\"$file\"");
> > header("Content-Transfer-Encoding: binary");
> > header("Content-Length: $content_length");
> > readfile("$file_fullpath");
> > exit;
> >
> > it works for files up to 10M (the same as the memory limit) but not above.
> > the paths to the file and the content length are correct, as i have checked
> > them and compared them against other files which i can download.
> >
> > is there a problem with this code? i have tried it with win ie 6, mac ie 5.1.2
> > and ns 4.7.. same issue
> >
> > cheers
> > christian
> >
> > Rasmus Lerdorf wrote:
> >
> > > readfile() reads 8k blocks at a time and dumps them out. It does not read
> > > the entire thing into ram, so that wouldn't be what was causing you to hit
> > > a memory limit. You must have done something else wrong then.
> > >
> > > -Rasmus
> > >
> > > On Fri, 4 Oct 2002, christian haines wrote:
> > >
> > > > thanks rasmus,
> > > >
> > > > i have tried readfile but it gave me the same issues as fpassthru.. both
> > > > cap at the memory_limit directive within the php.ini file
> > > >
> > > > any other suggestions maybe?
> > > >
> > > > cheers
> > > > christian
> > > >
> > > > Rasmus Lerdorf wrote:
> > > >
> > > > > readfile()
> > > > >
> > > > > On Fri, 4 Oct 2002, christian haines wrote:
> > > > >
> > > > > > hi all,
> > > > > >
> > > > > > i have successfully created a download script to force a user to
> > > > > > download, however attempting to download large files causes an error
> > > > > > saying that the file cannot be found.
> > > > > >
> > > > > > my code >
> > > > > > header("Cache-control: private");
> > > > > > header("Content-Type: application/force-download; name=\"$file\"");
> > > > > > header("Content-Disposition: attachment; filename=\"$file\"");
> > > > > > header("Content-Transfer-Encoding: binary");
> > > > > > header("Content-Length: $content_length");
> > > > > > $fp = fopen($file_fullpath, "r");
> > > > > > fpassthru($fp);
> > > > > > fclose($fp);
> > > > > > < my code
> > > > > >
> > > > > > this is a memory issue in the php.ini, i.e. if memory_limit = 8M then
> > > > > > the largest file i can download is 8M
> > > > > >
> > > > > > is there any way to "force" a download without having to use the
> > > > > > system-hungry fpassthru function?
> > > > > >
> > > > > > this is driving me nuts so any help would be greatly appreciated
> > > > > >
> > > > > > cheers
> > > > > > christian
> > > > > >
> > > > > > ps i read the following at the php.net fpassthru manual page but
> > > > > > could not make sense of it (it appears to be some kind of solution)
> > > > > >
> > > > > > "fpassthru() works best for small files. In download manager scripts,
> > > > > > it's best to determine the URL of the file to download (you may
> > > > > > generate it locally in your session data if you need to), and then
> > > > > > use HTTP __temporary__ redirects (302 status code, with a "Location:"
> > > > > > header specifying the effective download URL).
> > > > > > This saves your web server from keeping PHP scripts running for long
> > > > > > times during the file download; instead the download will be managed
> > > > > > directly by the web server without scripting support (consequence:
> > > > > > less memory used by parallel downloads)..."
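[The fpassthru manual note quoted above amounts to this: instead of streaming the bytes through PHP, answer with a temporary redirect and let the web server serve the file itself, so no PHP process (and no PHP memory) is tied up for the length of the download. A minimal sketch, assuming the file is also reachable at a plain URL; the `redirect_headers()` helper name and the example URL are made up:]

```php
<?php
// Sketch of the 302-redirect approach from the fpassthru manual note:
// build the status line and Location header for a temporary redirect.
// In practice the download URL would be generated from session data so
// the real file path is not guessable.
function redirect_headers($download_url)
{
    return array(
        "HTTP/1.1 302 Found",
        "Location: " . $download_url,
    );
}

// Send the headers only when running under a web server SAPI.
if (php_sapi_name() !== "cli") {
    foreach (redirect_headers("http://example.com/files/archive.zip") as $h) {
        header($h);
    }
    exit;
}
```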
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
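[Since readfile() and fpassthru() both write through PHP's output layer, a likely culprit for the memory_limit cap described in this thread is an active output buffer (output_buffering in php.ini, or an ob_start() call) accumulating the whole file before anything is sent. A sketch of a chunked-read alternative that keeps memory flat either way; the `stream_file()` helper name is made up, and `$file`, `$file_fullpath` and `$content_length` are assumed to be set as in the thread:]

```php
<?php
// Chunked download loop as an alternative to fpassthru()/readfile():
// read the file 8K at a time, echo each chunk, and flush, so memory use
// stays constant no matter how large the file is.
function stream_file($file_fullpath, $chunk_size = 8192)
{
    $fp = fopen($file_fullpath, "rb");
    if ($fp === false) {
        return false;
    }
    $sent = 0;
    while (!feof($fp)) {
        $chunk = fread($fp, $chunk_size);
        echo $chunk;
        $sent += strlen($chunk);
        flush(); // push the chunk toward the client immediately
    }
    fclose($fp);
    return $sent; // bytes written, a sanity check against Content-Length
}

// Hypothetical usage, with the variables set as in the thread:
//
//   header("Content-Disposition: attachment; filename=\"$file\"");
//   header("Content-Length: $content_length");
//   while (ob_get_level() > 0) {
//       ob_end_clean(); // drop any output buffer so chunks are not held in RAM
//   }
//   stream_file($file_fullpath);
//   exit;
```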