[PHP] Re: PHP/Apache: script unexpectedly invoked multiple times in parallel every 30 secs.

2009-03-11 Thread Martin Zvarík

Marc Venturini napsal(a):

Hi all,

I wrote a PHP script running in Apache which takes more than 30 seconds to
complete. It uses set_time_limit() to extend the time it is allowed to run.
The script generates thumbnails from a list of images. Upon completion, the
script redirects the browser to another page using HTTP headers.
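
For concreteness, a minimal sketch of the kind of script being described; the
paths, the GD-based thumbnail code and the redirect target are all invented
for illustration (the real code is not shown in this thread):

<?php
// thumbs.php (hypothetical): generate thumbnails, then redirect.
set_time_limit(600);   // extend the default 30-second limit

foreach (glob('/path/to/images/*.jpg') as $src) {
    $thumb = '/path/to/thumbs/' . basename($src);
    list($w, $h) = getimagesize($src);
    $scale = 150 / max($w, $h);
    $tw = (int)($w * $scale);
    $th = (int)($h * $scale);

    $in  = imagecreatefromjpeg($src);
    $out = imagecreatetruecolor($tw, $th);
    imagecopyresampled($out, $in, 0, 0, 0, 0, $tw, $th, $w, $h);
    imagejpeg($out, $thumb);
    imagedestroy($in);
    imagedestroy($out);
}

// upon completion, redirect the browser to another page
header('Location: done.php');
exit;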


If you die() at the end of the script and don't redirect, does it 
continue this automatic 30-second re-execution?
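
That is, replace the redirect at the end with something like this (the
message text is just an example):

<?php
// ... thumbnail generation as above ...

// instead of:
//   header('Location: done.php');
//   exit;
// just stop here, with no redirect at all:
die('Thumbnail generation finished.');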




On my local machine (Windows + EasyPHP), the script runs as expected and
completes after a few minutes.

I observe an unexpected behavior on my production web server:
- The script runs as expected for the first 30 seconds.
- After 30 seconds, the same script with the same parameters starts again in
a new thread/process. The initial thread/process is *not* interrupted, so 2
threads/processes run in parallel, executing the same sequence of operations
with a 30-second time shift.
- The same scenario repeats every 30 seconds (i.e. at 0:30, 1:00, 1:30, and
so on), multiplying the parallel threads/processes.



- The browser keeps on loading while the above happens.



- After some time, the browser displays a blank page and all the
threads/processes stop. I assume this is due to resource exhaustion, but I
have no means to check this assumption.

I deduced the above by reading a text file in which I log the sequence of
called functions.


It all looks like a redirection / unclosed-loop problem.



Unfortunately I have no access *at all* to my production web server
configuration (shared hosting, no documentation). I cannot even read the
configuration settings. While I'm considering moving to another host, I'd be
extremely pleased to have an explanation of the observed behavior.

I have browsed the mailing list archives and looked for an explanation in
other forums to no avail. This thread may deal with the same issue but does
not include any explanation or solution:
http://www.networkedmediatank.com/showthread.php?tid=17140

Thanks for reading, and please do not hesitate to ask for further
explanations if what I'm trying to achieve was not clear!


That it works on your local server is probably down to different 
versions/settings, but I bet there's an error somewhere in your script.


Consider sending it here, I'll take a look.



Cheers,
Marc.



Martin

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Re: PHP/Apache: script unexpectedly invoked multiple times in parallel every 30 secs.

2009-03-11 Thread haliphax
On Wed, Mar 11, 2009 at 10:30 AM, Martin Zvarík mzva...@gmail.com wrote:
 Marc Venturini napsal(a):
 The script generates thumbnails from a list of images. Upon completion, the
 script redirects the browser to another page using HTTP headers.
 [snip]

A blank URL does not redirect to the directory index, IIRC... it
refreshes the current page (such as a FORM tag with ACTION=""). This
may very well still be your problem.
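
For example, a page along these lines (the file name and form are made up)
would post back to itself every time the form is submitted, so the same
long-running script gets requested again:

<?php
// thumbs.php (hypothetical name): a self-posting form.
// With action="" the browser submits the form back to the current URL,
// so each submit requests this same (slow) script again.
if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // ... the long-running thumbnail generation would run here ...
}
?>
<form method="post" action="">
    <input type="submit" value="Generate thumbnails" />
</form>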

My 2c,


-- 
// Todd

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Re: PHP/Apache: script unexpectedly invoked multiple times in parallel every 30 secs.

2009-03-11 Thread Marc Venturini
Hi all,

Thank you all very much for your contributions.

I tried to monitor the network with Wireshark: there is only one request
from my browser to the server, and no answer (redirect or otherwise).
This means the problem is definitely not caused by unexpected browser requests.

Calling die() at the end of the script and removing the redirect did not
change the behavior in any way.

I like to think my code is good, and that the server calls the script in an
unexpected way. The main reason for this belief is that I do not use
multithreading at all, while the logs report the script is running several
times in parallel and the network monitor reports a single browser request.
I could not find in the docs any server configuration parameter which would
re-invoke a script without killing its currently running instance.

Unfortunately I cannot post the code here, as everything is spread across
several files. If I stay stuck on this issue for too long, I'll consider
reducing the script to the root cause of the problem and posting it, but it's
too much overhead at this stage...

Cheers,
Marc.


On Wed, Mar 11, 2009 at 4:40 PM, haliphax halip...@gmail.com wrote:

 [snip]

 A blank URL does not redirect to the directory index, IIRC... it
 refreshes the current page (such as a FORM tag with ACTION=""). This
 may very well still be your problem.

 My 2c,


 --
 // Todd





Re: [PHP] Re: PHP/Apache: script unexpectedly invoked multiple times in parallel every 30 secs.

2009-03-11 Thread Nathan Rixham

Marc Venturini wrote:

 [snip]

 I like to think my code is good, and that the server calls the script in an
 unexpected way. The main reason for this belief is that I do not use
 multithreading at all, while the logs report the script is running several
 times in parallel and the network monitor reports a single browser request.
 I could not find in the docs any server configuration parameter which would
 re-invoke a script without killing its currently running instance.



Are you forking the script at all? If so, you can't; that only works on the CLI, not under Apache.
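
For what it's worth, a sketch of what forking looks like where it is allowed,
i.e. a CLI script with the pcntl extension loaded (the file name and messages
are made up):

<?php
// fork_example.php: only meaningful when run as "php fork_example.php"
// from the command line, with the pcntl extension available.
if (php_sapi_name() !== 'cli') {
    die("Refusing to fork: not running on the CLI.\n");
}

$pid = pcntl_fork();
if ($pid === -1) {
    die("fork failed\n");
} elseif ($pid === 0) {
    // child process: do the heavy work here
    echo 'child ' . getmypid() . " working...\n";
    exit(0);
} else {
    // parent process: wait for the child to finish
    pcntl_waitpid($pid, $status);
    echo "child $pid finished\n";
}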

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Re: PHP/Apache: script unexpectedly invoked multiple times in parallel every 30 secs.

2009-03-11 Thread Michael A. Peters

Nathan Rixham wrote:

Marc Venturini wrote:

[snip]

I tried to monitor the network with Wireshark: there is only one request
from my browser to the server, and no answer (redirect or otherwise).
This means the problem is definitely not caused by unexpected browser requests.
[snip]



Are you forking the script at all? If so, you can't; that only works on the CLI, not under Apache.



I don't know what is causing it, but is the site live? If so, could it 
be a proxy somewhere re-requesting the data when it thinks your server 
has timed out? I guess you ruled that out with Wireshark, though.


If it really takes over 30 seconds to process the images, would it be 
better to just have your script queue the images and exit, with 
ImageMagick running on the server to do the actual hard work?


Write a script that cron runs every 5 minutes.
The script wgets the queue of what needs to be processed from your server 
and then processes it (see the sketch below).
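
Roughly something like this, written here as a PHP CLI script rather than a
shell script; the queue URL, the directories and the convert options are all
invented for the example:

#!/usr/bin/php
<?php
// thumb_worker.php (hypothetical): run from cron, e.g.
//   */5 * * * * /usr/bin/php /home/user/thumb_worker.php
// Fetches the list of images to process and lets ImageMagick's
// "convert" do the actual resizing, outside the web server.

$queueUrl = 'http://example.com/thumb_queue.php'; // hypothetical endpoint, one filename per line
$srcDir   = '/home/user/public_html/images';
$thumbDir = '/home/user/public_html/thumbs';

$queue = @file_get_contents($queueUrl);
if ($queue === false || trim($queue) === '') {
    exit(0); // nothing to do
}

foreach (array_filter(array_map('trim', explode("\n", $queue))) as $file) {
    $src = $srcDir . '/' . basename($file);   // basename() keeps paths sane
    $dst = $thumbDir . '/' . basename($file);
    if (!is_file($src)) {
        continue;
    }
    $cmd = 'convert ' . escapeshellarg($src) . ' -thumbnail 150x150 ' . escapeshellarg($dst);
    exec($cmd, $output, $rc);
    echo ($rc === 0 ? 'ok ' : 'FAILED ') . $file . "\n";
}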


I don't know for sure, but I suspect using ImageMagick in a shell script 
is going to be less resource intensive than the web server doing it. 
Doing it that way lets your script exit much sooner and would avoid 
impatient user reloads, which could be a problem even when you do figure 
out this issue.


I almost wonder if Apache has some directive that tries to serve the 
data again if it thinks there was a backend problem with its first request.


What happens when you try to request your page with wget or lynx?
They won't try to load any images, so if there is an image src problem, 
that should make it obvious.


--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php