ID: 40494
Updated by: [EMAIL PROTECTED]
Reported By: foster dot graeme at gmail dot com
Status: Bogus
Bug Type: Zip Related
Operating System: Linux
PHP Version: 5.2.1
Assigned To: pajoye

New Comment:
"I still think that it would be nice if there was some way for the system to manage this." It is in the TODO list. As I said three times already in this discussion. The solution is to add different modes: - commit at the end when the archive is close - immediate addition (will be much slower) And again, it is in my TODOs already. I cannot tell when they will be available (I do it on my free time). In the meantime a simple: if (($zip->numFiles % $yourlimit) == 0) {close; reopen;} will do it. "the archive can be partially built prior to the ulimit being reached. This could be set as 250, with the ability to overload it. Maybe this would only be triggered if a flag was set when the archive was opened." This solution does not work.The limit is arbitrary. There is no way to get an exact value (and I doubt php is the only running process). Previous Comments: ------------------------------------------------------------------------ [2007-02-15 14:02:51] foster dot graeme at gmail dot com Okay thanks for the explanation, I understand the problem a little better. I still think that it would be nice if there was some way for the system to manage this. I was thinking along the lines of a function to flush the files so that the archive can be partially built prior to the ulimit being reached. This could be set as 250, with the ability to overload it. Maybe this would only be triggered if a flag was set when the archive was opened. ------------------------------------------------------------------------ [2007-02-15 13:23:36] [EMAIL PROTECTED] See: http://pecl.php.net/bugs/bug.php?id=9443 "it would be good if this wasn't necessary, in thatthe code could catch the problem and allocate extra file handles if that is the problem." This is not something I can control. The operating system defines it and there is no way for me to increase this value. I suggest you to close and reopen it every 1000 or so (or even 255 if you want to go on the safest way, ie old windows). Future releases will have a different mode, where the checks will done only when you close the archives. ------------------------------------------------------------------------ [2007-02-15 13:14:57] foster dot graeme at gmail dot com Maybe I need to explain this problem a little more. I am trying to archive a folder on the server, at the moment it contains 5609 folders and 11,221 files. The script loops through the files adding them to the archive using the addFile() method. After the first 1002 files I get a ZIPARCHIVE::ER_OPEN. If I close the archive and the open it again I still have that error. However, if I close the archive and open it before I get that error then I can archive all 11,221 files. Since closing the file and re-opening fixes the problem (so long as I do that before I get the error) Then may I suggest that closing an archive will clear the status. Obviously, it would be good if this wasn't necessary, in thatthe code could catch the problem and allocate extra file handles if that is the problem. ------------------------------------------------------------------------ [2007-02-15 11:41:24] [EMAIL PROTECTED] "When adding files to an archive, (using successive ZipArchive::addFile() commands) the compression doesn't happen until the file is closed. " Yes, we do it while finalizing the archive. " This can result in an out of memory error, " You will run out of file ID before running out of memory. It does not really use many memory, only the file names and file handlers. I suppose you are talking about the file handlers? 
"It would certainly require a rewrite of the ugly function zip_close()" What is ugly in this function? Or do you have a portable way to lock a file until the archive creation is done? I think you refer to the file handlers limitation. There is already a bug about it and I plan to add a special (less safe) mode. This mode will allow one to add only the paths without checks, errors will occur only when the archive is closed. But that's a feature addition not a bug fix. I close this bug (not a bug > bogus). Thanks for your report! ------------------------------------------------------------------------ [2007-02-15 10:22:25] foster dot graeme at gmail dot com Description: ------------ When adding files to an archive, (using successive ZipArchive::addFile() commands) the compression doesn't happen until the file is closed. This can result in an out of memory error, a temporary fix is to close the archive and then reopen it within the php code. An idea solution would be to compress the file when it is added, probably in function _zip_replace(), but I don't know what the implications of this would be. It would certainly require a rewrite of the ugly function zip_close(). ------------------------------------------------------------------------ -- Edit this bug report at http://bugs.php.net/?id=40494&edit=1