Re: [PHP]Zip and text files generated are corrupted

2010-03-30 Thread Bastien Helders
I've come to realize something, but I'm not sure whether I'm right:

Maybe the instructions are interrupted because there is a lack of virtual
memory. I mean, isn't there a limit to the memory the script can use? It
would explain why the script goes on: when the instruction is
interrupted, all the memory taken by it is released.

I don't know if I was clear about what I wanted to say...
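
A rough sketch of how one might test that hypothesis (illustration only, not
code from the thread; $previous_patch and $zipname are the variables used in
the script quoted below):

<?php
// Log the configured limit and the peak usage around the suspect call.
error_reporting(-1);
echo 'memory_limit = ', ini_get('memory_limit'), PHP_EOL;

copy($previous_patch, $zipname);

echo 'peak memory: ', round(memory_get_peak_usage(true) / 1048576, 1), ' MB', PHP_EOL;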

2010/3/29 Bastien Helders eldroskan...@gmail.com

 I'm not sure. What is the exact command you are using?

 I'll show the code for the two scenarios; maybe it'll help. I've edited out
 the sensitive information, but I kept the essence of how it works.

 1) Copy the previous file and make modifications to it

 <?php
 //This is the command that got interrupted and thus created the unexpected
 //end-of-archive.
 //Note that $previous_patch is retrieved from another file server.
 copy($previous_patch, $zipname);

 //I go up in the file system, so that build/patchname doesn't appear in the
 //paths in the zip archive.
 chdir('build/'.$patchname);

 //For each new folder, add it to the copied patch
 foreach($folders_added as $folder){
     $command = 'zip -gr ../../' . $zipname . ' software/hfFolders/'.$folder.'/* 2>&1';
     exec($command, $output, $status);
     //show output and status
 }
 //I go down again, as it is no longer needed when deleting entries in a zip
 //file
 chdir('../..');

 //For each folder to be removed, remove it
 foreach($folders_removed as $folder){
     $command = 'zip -d ' . $zipname . ' software/hfFolders/'.$folder.'\* 2>&1';
     exec($command, $output, $status);
     //show output and status
 }



 2) After all the needed files are gathered in a temporary folder, compress
 them all

 <?php
 //I go up in the file system, so that build/patchname doesn't appear in the
 //paths in the zip archive.
 chdir('build/'.$patchname);
 $command = 'zip -r ../../' . $zipname . ' * 2>&1';
 //This is the command that times out in this case
 exec($command, $output, $status);
 //show output and status

 //Do the rest of the operations


 I wonder if the zipArchive route would be easier.

 That's what I was using before, but it modifies the timestamps of the files
 that are already in the zip archive, and I can't have that.


 According to the documentation, both Apache and IIS have similar
 timeout values ...
 
 Your web server can have other timeout configurations that may also
 interrupt PHP execution. Apache has a Timeout directive and IIS has a
 CGI timeout function. Both default to 300 seconds. See your web server
 documentation for specific details.
 (
 http://docs.php.net/manual/en/info.configuration.php#ini.max-execution-time
 )

 Yeah, I found this config in the httpd-default.conf file of my Apache
 installation, but since I determined, using two consecutive calls of
 microtime(), that the interrupted instructions don't run for more than 200
 seconds, I don't see how it is relevant... (and again, after the instruction
 is interrupted, the script continues to run.)


 Can you run the command from the shell directly without any problems.
 And run it repeatedly.

 I take it that the equivalent of the PHP copy() function is the Windows copy
 command line.
 In this case, both copy on the big archive and zip -r on a big collection of
 folders run in the shell without any problem, and repeatedly.


 2010/3/26 Richard Quadling rquadl...@googlemail.com

 On 26 March 2010 15:20, Bastien Helders eldroskan...@gmail.com wrote:
   I have checked the rights on the file for the first scenario and no
 user as
  locked it, I can see it, read it and write into it. I could even delete
 it
  if I wanted.
 
  For the second scenario, it doesn't even apply, as the exec('zip') that
  timeout try to create a new file (naturally in a folder where the web
 app
  user has all the necessary rights)
 
  In both case, it is no PHP timeout, as after the copy() in the first
  scenario, and the exec('zip') in the second scenario, the script
 continue to
  execute the other instructions, although the manipulation of the big
 files
  fails.
 
  But if it is not a PHP timeout, what is it?
 
  2010/3/26 Richard Quadling rquadl...@googlemail.com
 
  On 26 March 2010 12:21, Bastien Helders eldroskan...@gmail.com
 wrote:
   I already used error_reporting and set_time_limit and the use of
   ini_set('display_errors', 1); didn't display more exceptions.
  
   However the modification in the exec helped display STDERR I think.
  
   1) In the first scenario we have the following:
  
   STDERR
   zip warning: ../../build/Patch-6-3-2_Q3P15.zip not found or empty
  
   zip error: Internal logic error (write error on zip file)
   /STDERR
  
   The funny thing is, that now it is throwing status 5: a severe error
 in
   the
   zipfile format was
    detected. Processing probably failed immediately. Why It throw a
   status 5
   instead of a status 14, I can't say.
  
   So that's using 'zip -gr', when I stop using the option g and then
 call
   exec('zip -r 

Re: [PHP]Zip and text files generated are corrupted

2010-03-29 Thread Bastien Helders
I'm not sure. What is the exact command you are using?

I'll show the code for the two scenarios; maybe it'll help. I've edited out
the sensitive information, but I kept the essence of how it works.

1) Copy the previous file and make modifications to it

<?php
//This is the command that got interrupted and thus created the unexpected
//end-of-archive.
//Note that $previous_patch is retrieved from another file server.
copy($previous_patch, $zipname);

//I go up in the file system, so that build/patchname doesn't appear in the
//paths in the zip archive.
chdir('build/'.$patchname);

//For each new folder, add it to the copied patch
foreach($folders_added as $folder){
    $command = 'zip -gr ../../' . $zipname . ' software/hfFolders/'.$folder.'/* 2>&1';
    exec($command, $output, $status);
    //show output and status
}
//I go down again, as it is no longer needed when deleting entries in a zip file
chdir('../..');

//For each folder to be removed, remove it
foreach($folders_removed as $folder){
    $command = 'zip -d ' . $zipname . ' software/hfFolders/'.$folder.'\* 2>&1';
    exec($command, $output, $status);
    //show output and status
}
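
A hedged variant of the loop above (not the thread's exact code): it resets
$output between runs, since exec() appends to an existing array, and stops at
the first non-zero zip exit status instead of silently carrying on.

<?php
foreach ($folders_added as $folder) {
    $output = array();   // exec() appends to an existing array, so reset it
    $command = 'zip -gr ../../' . $zipname . ' software/hfFolders/' . $folder . '/* 2>&1';
    exec($command, $output, $status);
    if ($status !== 0) {
        echo "zip failed (status $status) on $folder:", PHP_EOL,
             implode(PHP_EOL, $output), PHP_EOL;
        break;
    }
}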



2) After all the needed files are gathered in a temporary folder, compress
them all

<?php
//I go up in the file system, so that build/patchname doesn't appear in the
//paths in the zip archive.
chdir('build/'.$patchname);
$command = 'zip -r ../../' . $zipname . ' * 2>&1';
//This is the command that times out in this case
exec($command, $output, $status);
//show output and status

//Do the rest of the operations

I wonder if the zipArchive route would be easier.

That's what I was using before, but it modifies the timestamps of the files
that are already in the zip archive, and I can't have that.
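
For comparison, a minimal sketch of the ZipArchive route mentioned above (it
does not solve the timestamp concern; $zipname and $patchname as in the code
above):

<?php
$zip = new ZipArchive();
if ($zip->open($zipname, ZipArchive::CREATE) !== true) {
    die("cannot open $zipname");
}
$base = 'build/' . $patchname;
$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($base, FilesystemIterator::SKIP_DOTS)
);
foreach ($files as $file) {
    if ($file->isFile()) {
        // store each entry relative to the patch folder, as chdir() + zip -r does
        $local = substr($file->getPathname(), strlen($base) + 1);
        $zip->addFile($file->getPathname(), str_replace('\\', '/', $local));
    }
}
$zip->close();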

According to the documentation, both Apache and IIS have similar
timeout values ...

Your web server can have other timeout configurations that may also
interrupt PHP execution. Apache has a Timeout directive and IIS has a
CGI timeout function. Both default to 300 seconds. See your web server
documentation for specific details.
(
http://docs.php.net/manual/en/info.configuration.php#ini.max-execution-time)

Yeah, I found this config in the httpd-default.conf file of my Apache
installation, but since I determined, using two consecutive calls of
microtime(), that the interrupted instructions don't run for more than 200
seconds, I don't see how it is relevant... (and again, after the instruction
is interrupted, the script continues to run.)
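
The measurement described here would look something like this (sketch only;
$command is whichever instruction is being timed):

<?php
$t0 = microtime(true);               // microtime(true) returns seconds as a float
exec($command, $output, $status);
echo 'exec() ran for ', round(microtime(true) - $t0, 1),
     " seconds, exit status $status", PHP_EOL;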

Can you run the command from the shell directly without any problems.
And run it repeatedly.

I take it that the equivalent of the PHP copy() function is the Windows copy
command line.
In this case, both copy on the big archive and zip -r on a big collection of
folders run in the shell without any problem, and repeatedly.

2010/3/26 Richard Quadling rquadl...@googlemail.com

 On 26 March 2010 15:20, Bastien Helders eldroskan...@gmail.com wrote:
   I have checked the rights on the file for the first scenario and no user
 as
  locked it, I can see it, read it and write into it. I could even delete
 it
  if I wanted.
 
  For the second scenario, it doesn't even apply, as the exec('zip') that
  timeout try to create a new file (naturally in a folder where the web app
  user has all the necessary rights)
 
  In both case, it is no PHP timeout, as after the copy() in the first
  scenario, and the exec('zip') in the second scenario, the script continue
 to
  execute the other instructions, although the manipulation of the big
 files
  fails.
 
  But if it is not a PHP timeout, what is it?
 
  2010/3/26 Richard Quadling rquadl...@googlemail.com
 
  On 26 March 2010 12:21, Bastien Helders eldroskan...@gmail.com wrote:
   I already used error_reporting and set_time_limit and the use of
   ini_set('display_errors', 1); didn't display more exceptions.
  
   However the modification in the exec helped display STDERR I think.
  
   1) In the first scenario we have the following:
  
   STDERR
   zip warning: ../../build/Patch-6-3-2_Q3P15.zip not found or empty
  
   zip error: Internal logic error (write error on zip file)
   /STDERR
  
   The funny thing is, that now it is throwing status 5: a severe error
 in
   the
   zipfile format was
   detected. Processing probably failed immediately. Why It throw a
   status 5
   instead of a status 14, I can't say.
  
   So that's using 'zip -gr', when I stop using the option g and then
 call
   exec('zip -r ...'), then I only get:
  
   STDERR
   zip error: Internal logic error (write error on zip file)
   /STDERR
  
   2) The error messages of the second scenario doesn't surprise me much:
  
   STDERR
   zip error: Unexpected end of zip file (build/Patch-6-3-2_Q3P15.zip)
   /STDERR
  
   Which was already known, as the call of copy() on the old patch P14
 crop
   it
   and thus prevent any operation to be done on it.
 
  So, the error is in the execution of the exec.
 
  Can you run the exec twice but to 2 different 

Re: [PHP]Zip and text files generated are corrupted

2010-03-27 Thread Kim Madsen

Mike Roberts wrote on 25/03/2010 14:56:

remove


No :-) Use the proper unsubscribe method rather than spamming the list.

--
Kind regards
Kim Emax - masterminds.dk

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP]Zip and text files generated are corrupted

2010-03-26 Thread Bastien Helders
I've already specified the outputs, and it doesn't change if I put it in a
file.

1)In the first scenario, where all the data are compressed together, the
only call of exec('zip') give this output:

<OUTPUT>
adding: bin/ (stored 0%)
adding: bin/startHotFixInstaller.bat (deflated 41%)
adding: bin/startHotFixInstaller.sh (deflated 49%)
adding: software/ (stored 0%)
adding: software/hotfixes/ (stored 0%)
adding: software/hotfixes/hfFolder/ (stored 0%)
[snip]
adding: software/hotfixes/hfFolder/Patch-632Q3-033/Server/lib/julia.jar
(deflated 4%)
adding: software/hotfixes/hfFolder/Patch-632Q3-033/Server/software/
</OUTPUT>

I snipped the output because it is a lot of the same, but you'll notice
that in the last line the compression status in parentheses is
missing, which leads me to think it has been interrupted.

I've done a bit of research in between. Of note is the status with which it
exited. Status 14 for the zip command means "error writing to a file", but
it isn't always the same files. Also, I raised the value of
max_input_time in php.ini from 60 to 600. Before the change the exec
instructions took about 60 seconds before being interrupted; after, they take
about 180-200 seconds, and not 600 as expected.
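
A quick way to confirm which limits the web SAPI actually runs with (sketch,
not from the thread), since a php.ini edit only helps if this SAPI reads that
php.ini:

<?php
foreach (array('max_execution_time', 'max_input_time', 'memory_limit') as $key) {
    echo $key, ' = ', ini_get($key), PHP_EOL;
}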

2) In the second scenario, as said, I copy the previous patch (P14, which
itself is a behemoth of a zip archive that was manually assembled) and then
add and delete only a few folders, each with a call to exec('zip...').
Each time it ends with status 2, which means "unexpected end of zip file".

And there is no output from any of those commands.

As with the single exec('zip..') in 1), the copy() of the previous patch took
about 60 seconds before the php.ini change and about 180-200 seconds after.
I take it that the copy() is interrupted, which would explain the unexpected
end of zip file (I can open the original patch P14 without any problem).

I hope I made myself more clear on the details of my problem.

Best Regards,
Bastien


2010/3/25 Richard Quadling rquadl...@googlemail.com

 On 25 March 2010 13:31, Bastien Helders eldroskan...@gmail.com wrote:
  I'm really stumped, it seems that although the script is running under
 the
  time limit, if a single instruction such as exec(zip) in the first
 case,
  or copy() in the second case are timing out, because it takes too much
 time
  processing the big file.
 
  Is there any configuration in php.ini (or anywhere else) that I could
 change
  to permit copy() or exec(zip) to run through without being interrupted?
 
  Regards,
  Bastien
 

 What is the output of the exec when the command fails?

 Not the return value of exec() which is the last line, but the whole
 thing, which is returned in the second parameter.

 If you can't see it due to pushing the file as part of the script,
 then try something like ...


 exec('zip ', $Output);
 file_put_contents('./ZipResults.txt', $Output);



 --
 -
 Richard Quadling
 Standing on the shoulders of some very clever giants!
 EE : http://www.experts-exchange.com/M_248814.html
 EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
 Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498r=213474731
 ZOPA : http://uk.zopa.com/member/RQuadling




-- 
haXe - an open source web programming language
http://haxe.org


Re: [PHP]Zip and text files generated are corrupted

2010-03-26 Thread Richard Quadling
On 26 March 2010 08:51, Bastien Helders eldroskan...@gmail.com wrote:
 I've already specified the outputs, and it doesn't change if I put it in a
 file.

 1)In the first scenario, where all the data are compressed together, the
 only call of exec('zip') give this output:

 OUTPUT
 adding: bin/ (stored 0%)
 adding: bin/startHotFixInstaller.bat (deflated 41%)
 adding: bin/startHotFixInstaller.sh (deflated 49%)
 adding: software/ (stored 0%)
 adding: software/hotfixes/ (stored 0%)
 adding: software/hotfixes/hfFolder/ (stored 0%)
 [snip]
 adding: software/hotfixes/hfFolder/Patch-632Q3-033/Server/lib/julia.jar
 (deflated 4%)
 adding: software/hotfixes/hfFolder/Patch-632Q3-033/Server/software/
 /OUTPUT

 I snipped the output because it is a lot of the same, but, you'll notice
 that in the last line, the status of the file between parenthesis is
 missing, which leads me to think it has been interrupted.

 I've made a few research in between.Of note, the status with which he
 exited. Status 14 for the zip command means error writing to a file. But
 it isn't always at the same files. Also, I upped the value of
 max_input_time in php.ini from 60 to 600. Before the change the exec
 instructions took about 60 seconds before interrupting, after it takes about
 180-200 seconds and not 600 as expected.

 2)In the second scenario, as said, I copy the previous patch (P14, which
 itself is a behemoth of a zip archive that was manually assembled) and then
 add and delete only a few folders, each calling the function exec('zip...').
 Each time it ends with status 2, which means unexpected end of zip files.

 And there is no output to each of those commands.

 As for the single exec('zip..') in 1), the copy() of the previous patch took
 about 60 seconds before the php.ini change and about 180-200 seconds after.
 I take it that the copy() is interrupted thus explaining the unexpected end
 of zip files (I can open the original patch P14 without any problem).

 I hope I made myself more clear on the details of my problem.

 Best Regards,
 Bastien


 2010/3/25 Richard Quadling rquadl...@googlemail.com

 On 25 March 2010 13:31, Bastien Helders eldroskan...@gmail.com wrote:
  I'm really stumped, it seems that although the script is running under
  the
  time limit, if a single instruction such as exec(zip) in the first
  case,
  or copy() in the second case are timing out, because it takes too much
  time
  processing the big file.
 
  Is there any configuration in php.ini (or anywhere else) that I could
  change
  to permit copy() or exec(zip) to run through without being
  interrupted?
 
  Regards,
  Bastien
 

 What is the output of the exec when the command fails?

 Not the return value of exec() which is the last line, but the whole
 thing, which is returned in the second parameter.

 If you can't see it due to pushing the file as part of the script,
 then try something like ...


 exec('zip ', $Output);
 file_put_contents('./ZipResults.txt', $Output);



 --
 -
 Richard Quadling
 Standing on the shoulders of some very clever giants!
 EE : http://www.experts-exchange.com/M_248814.html
 EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
 Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498r=213474731
 ZOPA : http://uk.zopa.com/member/RQuadling



 --
 haXe - an open source web programming language
 http://haxe.org


I _think_ that the $Output will only hold STDOUT and not STDERR.

Can you try this ...

exec('zip ... 2>&1', $Output);


Also,

error_reporting(-1); // Show ALL errors/warnings/notices.
ini_set('display_errors', 1); // Display them.
set_time_limit(0); // Allow run forever
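
Put together, a minimal diagnostic sketch along those lines ($command is
assumed to hold the full zip invocation the script builds):

<?php
error_reporting(-1);            // show ALL errors/warnings/notices
ini_set('display_errors', 1);   // display them
set_time_limit(0);              // no PHP time limit

exec($command . ' 2>&1', $output, $status);   // capture STDOUT and STDERR together
file_put_contents('./ZipResults.txt',
    "status: $status" . PHP_EOL . implode(PHP_EOL, $output) . PHP_EOL);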


-- 
-
Richard Quadling
Standing on the shoulders of some very clever giants!
EE : http://www.experts-exchange.com/M_248814.html
EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498r=213474731
ZOPA : http://uk.zopa.com/member/RQuadling

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP]Zip and text files generated are corrupted

2010-03-26 Thread Bastien Helders
I had already used error_reporting and set_time_limit, and adding
ini_set('display_errors', 1); didn't display more exceptions.

However, the modification to the exec helped display STDERR, I think.

1) In the first scenario we have the following:

<STDERR>
zip warning: ../../build/Patch-6-3-2_Q3P15.zip not found or empty

zip error: Internal logic error (write error on zip file)
</STDERR>

The funny thing is that now it is throwing status 5: "a severe error in the
zipfile format was detected. Processing probably failed immediately." Why it
throws a status 5 instead of a status 14, I can't say.

So that's using 'zip -gr'; when I drop the -g option and call
exec('zip -r ...') instead, I only get:

<STDERR>
zip error: Internal logic error (write error on zip file)
</STDERR>

2) The error message of the second scenario doesn't surprise me much:

<STDERR>
zip error: Unexpected end of zip file (build/Patch-6-3-2_Q3P15.zip)
</STDERR>

Which was already known, as the call of copy() on the old patch P14 truncates
it and thus prevents any operation from being done on it.

2010/3/26 Richard Quadling rquadl...@googlemail.com

 On 26 March 2010 08:51, Bastien Helders eldroskan...@gmail.com wrote:
  I've already specified the outputs, and it doesn't change if I put it in
 a
  file.
 
  1)In the first scenario, where all the data are compressed together, the
  only call of exec('zip') give this output:
 
  OUTPUT
  adding: bin/ (stored 0%)
  adding: bin/startHotFixInstaller.bat (deflated 41%)
  adding: bin/startHotFixInstaller.sh (deflated 49%)
  adding: software/ (stored 0%)
  adding: software/hotfixes/ (stored 0%)
  adding: software/hotfixes/hfFolder/ (stored 0%)
  [snip]
  adding: software/hotfixes/hfFolder/Patch-632Q3-033/Server/lib/julia.jar
  (deflated 4%)
  adding: software/hotfixes/hfFolder/Patch-632Q3-033/Server/software/
  /OUTPUT
 
  I snipped the output because it is a lot of the same, but, you'll notice
  that in the last line, the status of the file between parenthesis is
  missing, which leads me to think it has been interrupted.
 
  I've made a few research in between.Of note, the status with which he
  exited. Status 14 for the zip command means error writing to a file.
 But
  it isn't always at the same files. Also, I upped the value of
  max_input_time in php.ini from 60 to 600. Before the change the exec
  instructions took about 60 seconds before interrupting, after it takes
 about
  180-200 seconds and not 600 as expected.
 
  2)In the second scenario, as said, I copy the previous patch (P14, which
  itself is a behemoth of a zip archive that was manually assembled) and
 then
  add and delete only a few folders, each calling the function
 exec('zip...').
  Each time it ends with status 2, which means unexpected end of zip
 files.
 
  And there is no output to each of those commands.
 
  As for the single exec('zip..') in 1), the copy() of the previous patch
 took
  about 60 seconds before the php.ini change and about 180-200 seconds
 after.
  I take it that the copy() is interrupted thus explaining the unexpected
 end
  of zip files (I can open the original patch P14 without any problem).
 
  I hope I made myself more clear on the details of my problem.
 
  Best Regards,
  Bastien
 
 
  2010/3/25 Richard Quadling rquadl...@googlemail.com
 
  On 25 March 2010 13:31, Bastien Helders eldroskan...@gmail.com wrote:
   I'm really stumped, it seems that although the script is running under
   the
   time limit, if a single instruction such as exec(zip) in the first
   case,
   or copy() in the second case are timing out, because it takes too much
   time
   processing the big file.
  
   Is there any configuration in php.ini (or anywhere else) that I could
   change
   to permit copy() or exec(zip) to run through without being
   interrupted?
  
   Regards,
   Bastien
  
 
  What is the output of the exec when the command fails?
 
  Not the return value of exec() which is the last line, but the whole
  thing, which is returned in the second parameter.
 
  If you can't see it due to pushing the file as part of the script,
  then try something like ...
 
 
  exec('zip ', $Output);
  file_put_contents('./ZipResults.txt', $Output);
 
 
 
  --
  -
  Richard Quadling
  Standing on the shoulders of some very clever giants!
  EE : http://www.experts-exchange.com/M_248814.html
  EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
  Zend Certified Engineer :
 http://zend.com/zce.php?c=ZEND002498r=213474731
  ZOPA : http://uk.zopa.com/member/RQuadling
 
 
 
  --
  haXe - an open source web programming language
  http://haxe.org
 

 I _think_ that the $Output will only hold STDOUT and not STDERR.

 Can you try this ...

 exec('zip ... 2>&1', $Output);


 Also,

 error_reporting(-1); // Show ALL errors/warnings/notices.
 ini_set('display_errors', 1); // Display them.
 set_time_limit(0); // Allow run forever


 --
 -
 Richard Quadling
 Standing on the shoulders of some very clever giants!
 EE : 

Re: [PHP]Zip and text files generated are corrupted

2010-03-26 Thread Richard Quadling
On 26 March 2010 12:21, Bastien Helders eldroskan...@gmail.com wrote:
 I already used error_reporting and set_time_limit and the use of
 ini_set('display_errors', 1); didn't display more exceptions.

 However the modification in the exec helped display STDERR I think.

 1) In the first scenario we have the following:

 STDERR
 zip warning: ../../build/Patch-6-3-2_Q3P15.zip not found or empty

 zip error: Internal logic error (write error on zip file)
 /STDERR

 The funny thing is, that now it is throwing status 5: a severe error in the
 zipfile format was
 detected. Processing probably failed immediately. Why It throw a status 5
 instead of a status 14, I can't say.

 So that's using 'zip -gr', when I stop using the option g and then call
 exec('zip -r ...'), then I only get:

 STDERR
 zip error: Internal logic error (write error on zip file)
 /STDERR

 2) The error messages of the second scenario doesn't surprise me much:

 STDERR
 zip error: Unexpected end of zip file (build/Patch-6-3-2_Q3P15.zip)
 /STDERR

 Which was already known, as the call of copy() on the old patch P14 crop it
 and thus prevent any operation to be done on it.

So, the error is in the execution of the exec.

Can you run the exec twice, but to 2 different zip files?

If the issue is that PHP is timing out, then the first error COULD be
due to the process being killed and if so, the second one won't start.

But if the second one starts, then that pretty much rules out PHP timeouts.

I assume you've checked disk space and read access to the files in
question? i.e. they aren't locked by another user?
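
The two-zip test could look like this (sketch only; the target names are made
up):

<?php
foreach (array('test_a.zip', 'test_b.zip') as $target) {
    $out = array();              // reset, because exec() appends to an existing array
    exec('zip -r ' . $target . ' software/hfFolders/* 2>&1', $out, $status);
    echo $target, ' => status ', $status, PHP_EOL;
}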


-- 
-
Richard Quadling
Standing on the shoulders of some very clever giants!
EE : http://www.experts-exchange.com/M_248814.html
EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498r=213474731
ZOPA : http://uk.zopa.com/member/RQuadling

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP]Zip and text files generated are corrupted

2010-03-26 Thread Bastien Helders
 I have checked the rights on the file for the first scenario and no user has
locked it; I can see it, read it and write to it. I could even delete it
if I wanted.

For the second scenario, it doesn't even apply, as the exec('zip') that
times out tries to create a new file (naturally in a folder where the web app
user has all the necessary rights).

In both cases, it is not a PHP timeout, as after the copy() in the first
scenario, and the exec('zip') in the second scenario, the script continues to
execute the other instructions, although the manipulation of the big files
fails.

But if it is not a PHP timeout, what is it?

2010/3/26 Richard Quadling rquadl...@googlemail.com

 On 26 March 2010 12:21, Bastien Helders eldroskan...@gmail.com wrote:
  I already used error_reporting and set_time_limit and the use of
  ini_set('display_errors', 1); didn't display more exceptions.
 
  However the modification in the exec helped display STDERR I think.
 
  1) In the first scenario we have the following:
 
  STDERR
  zip warning: ../../build/Patch-6-3-2_Q3P15.zip not found or empty
 
  zip error: Internal logic error (write error on zip file)
  /STDERR
 
  The funny thing is, that now it is throwing status 5: a severe error in
 the
  zipfile format was
  detected. Processing probably failed immediately. Why It throw a status
 5
  instead of a status 14, I can't say.
 
  So that's using 'zip -gr', when I stop using the option g and then call
  exec('zip -r ...'), then I only get:
 
  STDERR
  zip error: Internal logic error (write error on zip file)
  /STDERR
 
  2) The error messages of the second scenario doesn't surprise me much:
 
  STDERR
  zip error: Unexpected end of zip file (build/Patch-6-3-2_Q3P15.zip)
  /STDERR
 
  Which was already known, as the call of copy() on the old patch P14 crop
 it
  and thus prevent any operation to be done on it.

 So, the error is in the execution of the exec.

 Can you run the exec twice but to 2 different zip files.

 If the issue is that PHP is timing out, then the first error COULD be
 due to the process being killed and if so, the second one won't start.

 But if the second one starts, then that pretty much rules out PHP timeouts.

 I assume you've checked disk space and read access to the files in
 question? i.e. they aren't locked by another user?


 --
 -
 Richard Quadling
 Standing on the shoulders of some very clever giants!
 EE : http://www.experts-exchange.com/M_248814.html
 EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
 Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498r=213474731
 ZOPA : http://uk.zopa.com/member/RQuadling




-- 
haXe - an open source web programming language
http://haxe.org


Re: [PHP]Zip and text files generated are corrupted

2010-03-26 Thread Richard Quadling
On 26 March 2010 15:20, Bastien Helders eldroskan...@gmail.com wrote:
  I have checked the rights on the file for the first scenario and no user as
 locked it, I can see it, read it and write into it. I could even delete it
 if I wanted.

 For the second scenario, it doesn't even apply, as the exec('zip') that
 timeout try to create a new file (naturally in a folder where the web app
 user has all the necessary rights)

 In both case, it is no PHP timeout, as after the copy() in the first
 scenario, and the exec('zip') in the second scenario, the script continue to
 execute the other instructions, although the manipulation of the big files
 fails.

 But if it is not a PHP timeout, what is it?

 2010/3/26 Richard Quadling rquadl...@googlemail.com

 On 26 March 2010 12:21, Bastien Helders eldroskan...@gmail.com wrote:
  I already used error_reporting and set_time_limit and the use of
  ini_set('display_errors', 1); didn't display more exceptions.
 
  However the modification in the exec helped display STDERR I think.
 
  1) In the first scenario we have the following:
 
  STDERR
  zip warning: ../../build/Patch-6-3-2_Q3P15.zip not found or empty
 
  zip error: Internal logic error (write error on zip file)
  /STDERR
 
  The funny thing is, that now it is throwing status 5: a severe error in
  the
  zipfile format was
   detected. Processing probably failed immediately. Why It throw a
  status 5
  instead of a status 14, I can't say.
 
  So that's using 'zip -gr', when I stop using the option g and then call
  exec('zip -r ...'), then I only get:
 
  STDERR
  zip error: Internal logic error (write error on zip file)
  /STDERR
 
  2) The error messages of the second scenario doesn't surprise me much:
 
  STDERR
  zip error: Unexpected end of zip file (build/Patch-6-3-2_Q3P15.zip)
  /STDERR
 
  Which was already known, as the call of copy() on the old patch P14 crop
  it
  and thus prevent any operation to be done on it.

 So, the error is in the execution of the exec.

 Can you run the exec twice but to 2 different zip files.

 If the issue is that PHP is timing out, then the first error COULD be
 due to the process being killed and if so, the second one won't start.

 But if the second one starts, then that pretty much rules out PHP
 timeouts.

 I assume you've checked disk space and read access to the files in
 question? i.e. they aren't locked by another user?


 --
 -
 Richard Quadling
 Standing on the shoulders of some very clever giants!
 EE : http://www.experts-exchange.com/M_248814.html
 EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
 Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498r=213474731
 ZOPA : http://uk.zopa.com/member/RQuadling



 --
 haXe - an open source web programming language
 http://haxe.org


I'm not sure. What is the exact command you are using?

I wonder if the zipArchive route would be easier.


According to the documentation, both Apache and IIS have similar
timeout values ...

Your web server can have other timeout configurations that may also
interrupt PHP execution. Apache has a Timeout directive and IIS has a
CGI timeout function. Both default to 300 seconds. See your web server
documentation for specific details.
(http://docs.php.net/manual/en/info.configuration.php#ini.max-execution-time)

Can you run the command from the shell directly without any problems?
And run it repeatedly?


-- 
-
Richard Quadling
Standing on the shoulders of some very clever giants!
EE : http://www.experts-exchange.com/M_248814.html
EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498r=213474731
ZOPA : http://uk.zopa.com/member/RQuadling

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP]Zip and text files generated are corrupted

2010-03-25 Thread Bastien Helders
So I tested two scenarios:

- First, I gather all the files selected for the patch and then compress
them together and here is what is displayed:

[Begin display]
The command zip -gr ../../build/Patch-6-3-2_Q3P15.zip * returned a status of
14 and the following output:
adding: bin/ (stored 0%)
adding: bin/startHotFixInstaller.bat (deflated 41%)
adding: bin/startHotFixInstaller.sh (deflated 49%)
adding: software/ (stored 0%)
adding: software/hotfixes/ (stored 0%)
[snip]

<br />

<b>Warning</b>:
rename(build/Patch-6-3-2_Q3P15.zip,P:/Path_For_Deposit/Patch-6-3-2_Q3P15/Patch-6-3-2_Q3P15.zip)
[function.rename]: No such file or directory
[End display]

I know that the rename didn't work because the zip command aborted and
generated no zip file.
There is no problem with the README text file.
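A hypothetical guard for the rename step shown above, so the move is only
attempted when zip exited cleanly and actually produced the archive:

<?php
$built  = 'build/Patch-6-3-2_Q3P15.zip';
$target = 'P:/Path_For_Deposit/Patch-6-3-2_Q3P15/Patch-6-3-2_Q3P15.zip';
if ($status === 0 && is_file($built)) {
    rename($built, $target);
} else {
    echo "zip failed (status $status); nothing to move", PHP_EOL;
}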

- Second scenario: I take the previous patch, compare the list of folders in
the previous patch with the list of selected folders, add the folders not in
the previous patch, and then remove any folders that weren't selected but were
in the previous patch.

In this case, all the commands, whether of the type
"zip -gr ../../build/Patch-6-3-2_Q3P15.zip
software/hotfixes/hfFolder/HF-632Q3-152/*" to add a folder or "zip -d
build/Patch-6-3-2_Q3P15.zip software/hotfixes/hfFolder/HF-632Q3-127\*" to
delete an unwanted folder, return status 2 and no output.

2010/3/24 Richard Quadling rquadl...@googlemail.com

 On 24 March 2010 15:19, Bastien Helders eldroskan...@gmail.com wrote:
  Hi Ashley,
 
  No, I set the time limit high enough (set_time_limit(2*HOUR+8*MINUTE);),
 and
  the execution stops a long time before the time limit is reached.
 
  It might be relevent that the web application is hosted on a Windows
  Machine.
 
  I asked myself, would setting the parameter memory_limit of the php.ini
  file to a higher value help? Actually it is set to 128M. But I actually
  don't have problems with creating a zip archive of about 250M (~80
 folders),
  it actually occurs with 3 times bigger archives.
 
  Best Regards,
  Bastien
 
  2010/3/24 Ashley Sheridan a...@ashleysheridan.co.uk
 
   On Wed, 2010-03-24 at 15:34 +0100, Bastien Helders wrote:
 
  Hi list,
 
  I've got this web app, which from a list of selected folders (with
 content)
  want to create a zip containing them as well as creating a text file
 with
  information about the chosen folders and how to use them.
 
  To create the zip file I use exec('zip -gr ' .$zipname.' * 
 mylog.log');
  in the temporary folder where I gathered all the data (using a
 zipArchive
  object was more time consuming). I then create the text file using
 fopen,
  many fwrites and a fclose.
 
  My problem is the following, normally it creates the archive and text
 file
  without any problem, but as soon as the number of selected folder has an
  high value (let's say about 150 of them), I've got problems with the
  generated files: The zip archive doesn't contain all the folders and
 there
  is an unexpected end of file on both zip and text files.
 
  My guess is, as it takes too much time, the script goes on to the next
  operation and close the streams uncleanly. But I can't be sure about
 that,
  and I don't know where to investigate.
 
  Regards,
  Bastien
 
 
  Is the script maybe running past the max_execution_time before the zip
  files are completed?
 
 
Thanks,
  Ash
  http://www.ashleysheridan.co.uk
 
 
 
 
 
  --
  haXe - an open source web programming language
  http://haxe.org
 


 Make sure you have ...

 error_reporting(-1); // show ALL errors/warnings/notices/etc.

 and ...

 exec($Command, $Output, $Status); // Capture the output.
 echo "The $Command returned a status of $Status and the following
 output:", PHP_EOL, implode(PHP_EOL, $Output), PHP_EOL;

 sort of thing.

 The error may be in the zip.
 --
 -
 Richard Quadling
 Standing on the shoulders of some very clever giants!
 EE : http://www.experts-exchange.com/M_248814.html
 EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
 Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498r=213474731
 ZOPA : http://uk.zopa.com/member/RQuadling




-- 
haXe - an open source web programming language
http://haxe.org


Re: [PHP]Zip and text files generated are corrupted

2010-03-25 Thread Bastien Helders
Forgot to say: it is the second scenario that generates corrupted zip and
text files with unexpected end-of-file errors.

2010/3/25 Bastien Helders eldroskan...@gmail.com

 So I tested two scenario:

 - First, I gather all the files selected for the patch and then compress
 them together and here is what is displayed:

 [Begin display]
 The command zip -gr ../../build/Patch-6-3-2_Q3P15.zip * returned a status
 of 14 and the following output:
 adding: bin/ (stored 0%)
 adding: bin/startHotFixInstaller.bat (deflated 41%)
 adding: bin/startHotFixInstaller.sh (deflated 49%)
 adding: software/ (stored 0%)
 adding: software/hotfixes/ (stored 0%)
 [snip]

 br

 bWarning/b:
 rename(build/Patch-6-3-2_Q3P15.zip,P:/Path_For_Deposit/Patch-6-3-2_Q3P15/Patch-6-3-2_Q3P15.zip)
 [function.rename]: No such file or directory
 [End display]

 I know that the rename didn't work, while the zip command aborted and
 generated no zip file.
 There is no problem with the README text file.

 - Second scenario, I take the previous patch, compare the list of folders
 in the previous patch with list of selected folder, add the folders not in
 the previous patch and eventually remove folders that weren't selected but
 were in the previous patch.

 In this case, all the commands, may it be of the type
 zip -gr ../../build/Patch-6-3-2_Q3P15.zip
 software/hotfixes/hfFolder/HF-632Q3-152/* to add a folder or zip -d
 build/Patch-6-3-2_Q3P15.zip software/hotfixes/hfFolder/HF-632Q3-127\* to
 delete an unwanted folder returns all with status 2 and no output.

 2010/3/24 Richard Quadling rquadl...@googlemail.com

 On 24 March 2010 15:19, Bastien Helders eldroskan...@gmail.com wrote:
  Hi Ashley,
 
  No, I set the time limit high enough (set_time_limit(2*HOUR+8*MINUTE);),
 and
  the execution stops a long time before the time limit is reached.
 
  It might be relevent that the web application is hosted on a Windows
  Machine.
 
  I asked myself, would setting the parameter memory_limit of the
 php.ini
  file to a higher value help? Actually it is set to 128M. But I actually
  don't have problems with creating a zip archive of about 250M (~80
 folders),
  it actually occurs with 3 times bigger archives.
 
  Best Regards,
  Bastien
 
  2010/3/24 Ashley Sheridan a...@ashleysheridan.co.uk
 
   On Wed, 2010-03-24 at 15:34 +0100, Bastien Helders wrote:
 
  Hi list,
 
  I've got this web app, which from a list of selected folders (with
 content)
  want to create a zip containing them as well as creating a text file
 with
  information about the chosen folders and how to use them.
 
  To create the zip file I use exec('zip -gr ' .$zipname.' * 
 mylog.log');
  in the temporary folder where I gathered all the data (using a
 zipArchive
  object was more time consuming). I then create the text file using
 fopen,
  many fwrites and a fclose.
 
  My problem is the following, normally it creates the archive and text
 file
  without any problem, but as soon as the number of selected folder has
 an
  high value (let's say about 150 of them), I've got problems with the
  generated files: The zip archive doesn't contain all the folders and
 there
  is an unexpected end of file on both zip and text files.
 
  My guess is, as it takes too much time, the script goes on to the next
  operation and close the streams uncleanly. But I can't be sure about
 that,
  and I don't know where to investigate.
 
  Regards,
  Bastien
 
 
  Is the script maybe running past the max_execution_time before the zip
  files are completed?
 
 
Thanks,
  Ash
  http://www.ashleysheridan.co.uk
 
 
 
 
 
  --
  haXe - an open source web programming language
  http://haxe.org
 


 Make sure you have ...

 error_reporting(-1); // show ALL errors/warnings/notices/etc.

 and ...

 exec($Command, $Output, $Status); // Capture the output.
 echo "The $Command returned a status of $Status and the following
 output:", PHP_EOL, implode(PHP_EOL, $Output), PHP_EOL;

 sort of thing.

 The error may be in the zip.
 --
 -
 Richard Quadling
 Standing on the shoulders of some very clever giants!
 EE : http://www.experts-exchange.com/M_248814.html
 EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
 Zend Certified Engineer :
 http://zend.com/zce.php?c=ZEND002498r=213474731
 ZOPA : http://uk.zopa.com/member/RQuadling




 --
 haXe - an open source web programming language
 http://haxe.org




-- 
haXe - an open source web programming language
http://haxe.org


Re: [PHP]Zip and text files generated are corrupted

2010-03-25 Thread Bastien Helders
I'm really stumped. It seems that although the script is running under the
time limit, a single instruction such as exec('zip') in the first case,
or copy() in the second case, is timing out because it takes too much time
processing the big file.

Is there any configuration in php.ini (or anywhere else) that I could change
to permit copy() or exec(zip) to run through without being interrupted?

Regards,
Bastien

2010/3/25 Bastien Helders eldroskan...@gmail.com

 Forgot to say, it is the second scenario that generate corrupted zip and
 text files with unexpected end of files.

 2010/3/25 Bastien Helders eldroskan...@gmail.com

 So I tested two scenario:

 - First, I gather all the files selected for the patch and then compress
 them together and here is what is displayed:

 [Begin display]
 The command zip -gr ../../build/Patch-6-3-2_Q3P15.zip * returned a status
 of 14 and the following output:
 adding: bin/ (stored 0%)
 adding: bin/startHotFixInstaller.bat (deflated 41%)
 adding: bin/startHotFixInstaller.sh (deflated 49%)
 adding: software/ (stored 0%)
 adding: software/hotfixes/ (stored 0%)
 [snip]

 br

 bWarning/b:
 rename(build/Patch-6-3-2_Q3P15.zip,P:/Path_For_Deposit/Patch-6-3-2_Q3P15/Patch-6-3-2_Q3P15.zip)
 [function.rename]: No such file or directory
 [End display]

 I know that the rename didn't work, while the zip command aborted and
 generated no zip file.
 There is no problem with the README text file.

 - Second scenario, I take the previous patch, compare the list of folders
 in the previous patch with list of selected folder, add the folders not in
 the previous patch and eventually remove folders that weren't selected but
 were in the previous patch.

 In this case, all the commands, may it be of the type
 zip -gr ../../build/Patch-6-3-2_Q3P15.zip
 software/hotfixes/hfFolder/HF-632Q3-152/* to add a folder or zip -d
 build/Patch-6-3-2_Q3P15.zip software/hotfixes/hfFolder/HF-632Q3-127\* to
 delete an unwanted folder returns all with status 2 and no output.

 2010/3/24 Richard Quadling rquadl...@googlemail.com

 On 24 March 2010 15:19, Bastien Helders eldroskan...@gmail.com wrote:
  Hi Ashley,
 
  No, I set the time limit high enough
 (set_time_limit(2*HOUR+8*MINUTE);), and
  the execution stops a long time before the time limit is reached.
 
  It might be relevent that the web application is hosted on a Windows
  Machine.
 
  I asked myself, would setting the parameter memory_limit of the
 php.ini
  file to a higher value help? Actually it is set to 128M. But I actually
  don't have problems with creating a zip archive of about 250M (~80
 folders),
  it actually occurs with 3 times bigger archives.
 
  Best Regards,
  Bastien
 
  2010/3/24 Ashley Sheridan a...@ashleysheridan.co.uk
 
   On Wed, 2010-03-24 at 15:34 +0100, Bastien Helders wrote:
 
  Hi list,
 
  I've got this web app, which from a list of selected folders (with
 content)
  want to create a zip containing them as well as creating a text file
 with
  information about the chosen folders and how to use them.
 
  To create the zip file I use exec('zip -gr ' .$zipname.' * 
 mylog.log');
  in the temporary folder where I gathered all the data (using a
 zipArchive
  object was more time consuming). I then create the text file using
 fopen,
  many fwrites and a fclose.
 
  My problem is the following, normally it creates the archive and text
 file
  without any problem, but as soon as the number of selected folder has
 an
  high value (let's say about 150 of them), I've got problems with the
  generated files: The zip archive doesn't contain all the folders and
 there
  is an unexpected end of file on both zip and text files.
 
  My guess is, as it takes too much time, the script goes on to the next
  operation and close the streams uncleanly. But I can't be sure about
 that,
  and I don't know where to investigate.
 
  Regards,
  Bastien
 
 
  Is the script maybe running past the max_execution_time before the zip
  files are completed?
 
 
Thanks,
  Ash
  http://www.ashleysheridan.co.uk
 
 
 
 
 
  --
  haXe - an open source web programming language
  http://haxe.org
 


 Make sure you have ...

 error_reporting(-1); // show ALL errors/warnings/notices/etc.

 and ...

 exec($Command, $Output, $Status); // Capture the output.
  echo "The $Command returned a status of $Status and the following
  output:", PHP_EOL, implode(PHP_EOL, $Output), PHP_EOL;

 sort of thing.

 The error may be in the zip.
 --
 -
 Richard Quadling
 Standing on the shoulders of some very clever giants!
 EE : http://www.experts-exchange.com/M_248814.html
 EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
 Zend Certified Engineer :
 http://zend.com/zce.php?c=ZEND002498r=213474731
 ZOPA : http://uk.zopa.com/member/RQuadling




 --
 haXe - an open source web programming language
 http://haxe.org




 --
 haXe - an open source web programming language
 http://haxe.org




-- 
haXe - an open source web programming language
http://haxe.org


RE: [PHP]Zip and text files generated are corrupted

2010-03-25 Thread Mike Roberts
remove





 Sincerely,

 Michael Roberts
Executive Recruiter
 Corporate Staffing Services
 150 Monument Road, Suite 510
 Bala Cynwyd, PA 19004
 P 610-771-1084
 F 610-771-0390
 E mrobe...@jobscss.com
Check out my recent feature article in Professional Surveyor 12/09
edition. 
http://www.profsurv.com/magazine/article.aspx?i=70379






-Original Message-
From: Bastien Helders [mailto:eldroskan...@gmail.com] 
Sent: Thursday, March 25, 2010 9:32 AM
To: rquadl...@googlemail.com
Cc: a...@ashleysheridan.co.uk; php-general@lists.php.net
Subject: Re: [PHP]Zip and text files generated are corrupted

I'm really stumped, it seems that although the script is running under
the
time limit, if a single instruction such as exec(zip) in the first
case,
or copy() in the second case are timing out, because it takes too much
time
processing the big file.

Is there any configuration in php.ini (or anywhere else) that I could
change
to permit copy() or exec(zip) to run through without being
interrupted?

Regards,
Bastien

2010/3/25 Bastien Helders eldroskan...@gmail.com

 Forgot to say, it is the second scenario that generate corrupted zip
and
 text files with unexpected end of files.

 2010/3/25 Bastien Helders eldroskan...@gmail.com

 So I tested two scenario:

 - First, I gather all the files selected for the patch and then
compress
 them together and here is what is displayed:

 [Begin display]
 The command zip -gr ../../build/Patch-6-3-2_Q3P15.zip * returned a
status
 of 14 and the following output:
 adding: bin/ (stored 0%)
 adding: bin/startHotFixInstaller.bat (deflated 41%)
 adding: bin/startHotFixInstaller.sh (deflated 49%)
 adding: software/ (stored 0%)
 adding: software/hotfixes/ (stored 0%)
 [snip]

 br

 bWarning/b:

rename(build/Patch-6-3-2_Q3P15.zip,P:/Path_For_Deposit/Patch-6-3-2_Q3P15
/Patch-6-3-2_Q3P15.zip)
 [function.rename]: No such file or directory
 [End display]

 I know that the rename didn't work, while the zip command aborted and
 generated no zip file.
 There is no problem with the README text file.

 - Second scenario, I take the previous patch, compare the list of
folders
 in the previous patch with list of selected folder, add the folders
not in
 the previous patch and eventually remove folders that weren't
selected but
 were in the previous patch.

 In this case, all the commands, may it be of the type
 zip -gr ../../build/Patch-6-3-2_Q3P15.zip
 software/hotfixes/hfFolder/HF-632Q3-152/* to add a folder or zip -d
 build/Patch-6-3-2_Q3P15.zip
software/hotfixes/hfFolder/HF-632Q3-127\* to
 delete an unwanted folder returns all with status 2 and no output.

 2010/3/24 Richard Quadling rquadl...@googlemail.com

 On 24 March 2010 15:19, Bastien Helders eldroskan...@gmail.com
wrote:
  Hi Ashley,
 
  No, I set the time limit high enough
 (set_time_limit(2*HOUR+8*MINUTE);), and
  the execution stops a long time before the time limit is reached.
 
  It might be relevent that the web application is hosted on a
Windows
  Machine.
 
  I asked myself, would setting the parameter memory_limit of the
 php.ini
  file to a higher value help? Actually it is set to 128M. But I
actually
  don't have problems with creating a zip archive of about 250M (~80
 folders),
  it actually occurs with 3 times bigger archives.
 
  Best Regards,
  Bastien
 
  2010/3/24 Ashley Sheridan a...@ashleysheridan.co.uk
 
   On Wed, 2010-03-24 at 15:34 +0100, Bastien Helders wrote:
 
  Hi list,
 
  I've got this web app, which from a list of selected folders
(with
 content)
  want to create a zip containing them as well as creating a text
file
 with
  information about the chosen folders and how to use them.
 
  To create the zip file I use exec('zip -gr ' .$zipname.' * 
 mylog.log');
  in the temporary folder where I gathered all the data (using a
 zipArchive
  object was more time consuming). I then create the text file
using
 fopen,
  many fwrites and a fclose.
 
  My problem is the following, normally it creates the archive and
text
 file
  without any problem, but as soon as the number of selected folder
has
 an
  high value (let's say about 150 of them), I've got problems with
the
  generated files: The zip archive doesn't contain all the folders
and
 there
  is an unexpected end of file on both zip and text files.
 
  My guess is, as it takes too much time, the script goes on to the
next
  operation and close the streams uncleanly. But I can't be sure
about
 that,
  and I don't know where to investigate.
 
  Regards,
  Bastien
 
 
  Is the script maybe running past the max_execution_time before
the zip
  files are completed?
 
 
Thanks,
  Ash
  http://www.ashleysheridan.co.uk
 
 
 
 
 
  --
  haXe - an open source web programming language
  http://haxe.org
 


 Make sure you have ...

 error_reporting(-1); // show ALL errors/warnings/notices/etc.

 and ...

 exec($Command, $Output, $Status); // Capture the output.
 echo The $Command returned a status of $Status and the following
 output:, PHP_EOL, implode(PHP_EOL, $Output

Re: [PHP]Zip and text files generated are corrupted

2010-03-25 Thread Richard Quadling
On 25 March 2010 13:31, Bastien Helders eldroskan...@gmail.com wrote:
 I'm really stumped, it seems that although the script is running under the
 time limit, if a single instruction such as exec(zip) in the first case,
 or copy() in the second case are timing out, because it takes too much time
 processing the big file.

 Is there any configuration in php.ini (or anywhere else) that I could change
 to permit copy() or exec(zip) to run through without being interrupted?

 Regards,
 Bastien


What is the output of the exec when the command fails?

Not the return value of exec() which is the last line, but the whole
thing, which is returned in the second parameter.

If you can't see it due to pushing the file as part of the script,
then try something like ...


exec('zip ', $Output);
file_put_contents('./ZipResults.txt', $Output);



-- 
-
Richard Quadling
Standing on the shoulders of some very clever giants!
EE : http://www.experts-exchange.com/M_248814.html
EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498r=213474731
ZOPA : http://uk.zopa.com/member/RQuadling

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP]Zip and text files generated are corrupted

2010-03-24 Thread Ashley Sheridan
On Wed, 2010-03-24 at 15:34 +0100, Bastien Helders wrote:

 Hi list,
 
 I've got this web app which, from a list of selected folders (with content),
 wants to create a zip containing them, as well as a text file with
 information about the chosen folders and how to use them.
 
 To create the zip file I use exec('zip -gr ' .$zipname.' * > mylog.log');
 in the temporary folder where I gathered all the data (using a zipArchive
 object was more time consuming). I then create the text file using fopen,
 many fwrites and a fclose.
 
 My problem is the following: normally it creates the archive and text file
 without any problem, but as soon as the number of selected folders reaches a
 high value (let's say about 150 of them), I've got problems with the
 generated files: the zip archive doesn't contain all the folders and there
 is an unexpected end of file on both the zip and text files.
 
 My guess is that, as it takes too much time, the script goes on to the next
 operation and closes the streams uncleanly. But I can't be sure about that,
 and I don't know where to investigate.
 
 Regards,
 Bastien


Is the script maybe running past the max_execution_time before the zip
files are completed?

Thanks,
Ash
http://www.ashleysheridan.co.uk
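
The fopen/fwrite/fclose sequence described in the quoted message could be
hardened along these lines (sketch only; $readme_path and $lines are made-up
names), so a truncated README is reported rather than silently produced:

<?php
$fh = fopen($readme_path, 'w');
if ($fh === false) {
    die("cannot open $readme_path");
}
foreach ($lines as $line) {
    if (fwrite($fh, $line . PHP_EOL) === false) {
        die("short write on $readme_path");
    }
}
fflush($fh);
if (!fclose($fh)) {
    die("could not close $readme_path cleanly");
}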




Re: [PHP]Zip and text files generated are corrupted

2010-03-24 Thread Bastien Helders
Hi Ashley,

No, I set the time limit high enough (set_time_limit(2*HOUR+8*MINUTE);), and
the execution stops a long time before the time limit is reached.

It might be relevant that the web application is hosted on a Windows
machine.

I asked myself whether setting the memory_limit parameter of the php.ini
file to a higher value would help. Currently it is set to 128M. But I
don't have problems creating a zip archive of about 250M (~80 folders);
the problem only occurs with archives three times bigger.

Best Regards,
Bastien
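
A hypothetical quick test of the memory_limit idea: raise it for this request
only and confirm what the engine actually applied (the value is arbitrary):

<?php
ini_set('memory_limit', '512M');
echo 'memory_limit now: ', ini_get('memory_limit'), PHP_EOL;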

2010/3/24 Ashley Sheridan a...@ashleysheridan.co.uk

  On Wed, 2010-03-24 at 15:34 +0100, Bastien Helders wrote:

 Hi list,

 I've got this web app, which from a list of selected folders (with content)
 want to create a zip containing them as well as creating a text file with
 information about the chosen folders and how to use them.

 To create the zip file I use exec('zip -gr ' .$zipname.' * > mylog.log');
 in the temporary folder where I gathered all the data (using a zipArchive
 object was more time consuming). I then create the text file using fopen,
 many fwrites and a fclose.

 My problem is the following, normally it creates the archive and text file
 without any problem, but as soon as the number of selected folder has an
 high value (let's say about 150 of them), I've got problems with the
 generated files: The zip archive doesn't contain all the folders and there
 is an unexpected end of file on both zip and text files.

 My guess is, as it takes too much time, the script goes on to the next
 operation and close the streams uncleanly. But I can't be sure about that,
 and I don't know where to investigate.

 Regards,
 Bastien


 Is the script maybe running past the max_execution_time before the zip
 files are completed?


   Thanks,
 Ash
 http://www.ashleysheridan.co.uk





-- 
haXe - an open source web programming language
http://haxe.org


Re: [PHP]Zip and text files generated are corrupted

2010-03-24 Thread Richard Quadling
On 24 March 2010 15:19, Bastien Helders eldroskan...@gmail.com wrote:
 Hi Ashley,

 No, I set the time limit high enough (set_time_limit(2*HOUR+8*MINUTE);), and
 the execution stops a long time before the time limit is reached.

 It might be relevent that the web application is hosted on a Windows
 Machine.

 I asked myself, would setting the parameter memory_limit of the php.ini
 file to a higher value help? Actually it is set to 128M. But I actually
 don't have problems with creating a zip archive of about 250M (~80 folders),
 it actually occurs with 3 times bigger archives.

 Best Regards,
 Bastien

 2010/3/24 Ashley Sheridan a...@ashleysheridan.co.uk

  On Wed, 2010-03-24 at 15:34 +0100, Bastien Helders wrote:

 Hi list,

 I've got this web app, which from a list of selected folders (with content)
 want to create a zip containing them as well as creating a text file with
 information about the chosen folders and how to use them.

 To create the zip file I use exec('zip -gr ' .$zipname.' * > mylog.log');
 in the temporary folder where I gathered all the data (using a zipArchive
 object was more time consuming). I then create the text file using fopen,
 many fwrites and a fclose.

 My problem is the following, normally it creates the archive and text file
 without any problem, but as soon as the number of selected folder has an
 high value (let's say about 150 of them), I've got problems with the
 generated files: The zip archive doesn't contain all the folders and there
 is an unexpected end of file on both zip and text files.

 My guess is, as it takes too much time, the script goes on to the next
 operation and close the streams uncleanly. But I can't be sure about that,
 and I don't know where to investigate.

 Regards,
 Bastien


 Is the script maybe running past the max_execution_time before the zip
 files are completed?


   Thanks,
 Ash
 http://www.ashleysheridan.co.uk





 --
 haXe - an open source web programming language
 http://haxe.org



Make sure you have ...

error_reporting(-1); // show ALL errors/warnings/notices/etc.

and ...

exec($Command, $Output, $Status); // Capture the output.
echo "The $Command returned a status of $Status and the following
output:", PHP_EOL, implode(PHP_EOL, $Output), PHP_EOL;

sort of thing.

The error may be in the zip.
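
A small sketch of how that advice could be wrapped up (the helper name
run_logged() is made up for illustration, it is not from this thread): failing
loudly on a non-zero exit status makes a silently truncated archive much
harder to miss.

<?php
error_reporting(-1); // surface every error, warning and notice

// Hypothetical helper: run a shell command, echo its status and output,
// and stop if it did not exit cleanly.
function run_logged($command)
{
    exec($command . ' 2>&1', $output, $status);
    echo "The $command returned a status of $status and the following output:",
         PHP_EOL, implode(PHP_EOL, $output), PHP_EOL;
    if ($status !== 0) {
        throw new RuntimeException("command failed with status $status: $command");
    }
    return $output;
}

run_logged('zip -gr ../../patch.zip *'); // example call; the path is a placeholder
?>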
-- 
-
Richard Quadling
Standing on the shoulders of some very clever giants!
EE : http://www.experts-exchange.com/M_248814.html
EE4Free : http://www.experts-exchange.com/becomeAnExpert.jsp
Zend Certified Engineer : http://zend.com/zce.php?c=ZEND002498r=213474731
ZOPA : http://uk.zopa.com/member/RQuadling

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


