I have a download script that streams the contents of multiple files
into a zip archive before passing it on to the browser to be
downloaded. The script uses file_get_contents() and gzdeflate() to
loop over multiple files to create the archive. Everything works
fine, except I have noticed that for a large number of files, this
script will exceed the php memory limit. I have used
memory_get_peak_usage() to narrow down the source of the high memory
usage and found it to be the two functions above. The functions are
called in a loop, and the variable containing the file data is unset()
and not referenced between calls. The peak memory usage of the script
should be a function of the single largest file included in the
archive, but it appears to be the aggregate of all the files.
Here is the pseudo-code for this loop:
header( /* specify headers to indicate a download */ );
foreach ( $files as $file )
{
    echo zip_local_header_for( $file ); // helper that emits the zip local file header
    $data  = file_get_contents( $file );
    $zdata = gzdeflate( $data );
    echo $zdata;                        // stream the compressed data to the browser
    unset( $data );
    unset( $zdata );
}
echo zip_central_dir_for( $files );     // helper that emits the zip central directory
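For what it's worth, the measurement was just a logging call at the
end of each iteration, something like this (the logging line itself is
only illustrative):

    // peak usage keeps climbing with each file instead of plateauing
    error_log( $file . ': peak ' . memory_get_peak_usage() . ' bytes' );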
If I either remove the gzdeflate() call or replace the
file_get_contents() with an fread()-based method, the script no longer
experiences memory problems.
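The fread() variant looked roughly like the sketch below (the chunk
size and compression level are arbitrary, and the zlib.deflate read
filter stands in for gzdeflate() so the output stays compressed; the
zip_*() helpers are the same as above):

foreach ( $files as $file )
{
    echo zip_local_header_for( $file );
    $fp = fopen( $file, 'rb' );
    // compress the data as it is read, so only one chunk
    // is in memory at a time
    stream_filter_append( $fp, 'zlib.deflate', STREAM_FILTER_READ,
                          array( 'level' => 6 ) );
    while ( !feof( $fp ) )
    {
        echo fread( $fp, 8192 );   // emit compressed bytes chunk by chunk
    }
    fclose( $fp );
}
echo zip_central_dir_for( $files );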
Is this behavior by design for these two functions (because PHP
scripts are usually short-lived)? Is there a way to get them to
release memory? Is there something I'm missing? Thanks.
-- Ryan