[PHP] Question about exec() calling a CLI PHP script from Apache CGI/filter-module PHP
I call red.php through an Apache HTTP request, and red.php makes an exec('./s.php') call. Then I got a bunch of s.php processes, and the system locked me out of any task because no more processes were available. Why would this happen?

marvin:~/liang.ns2user.info/php> less red.php

<?php
ignore_user_abort(true);
header("Location: redirect2.html");
exec('./s.php');
?>

marvin:~/liang.ns2user.info/php> less s.php

#!/usr/local/bin/php -q
<?php
echo "This is a PHP-CLI Script!!";
for ($i = 0; $i < 10; $i++) {
    echo $i;
    sleep(1);
}
$fp = fopen("/tmp/foo.txt", "a");
fputs($fp, $i);
fclose($fp);
?>

marvin:~> ps -u liang
  PID TTY          TIME CMD
20368 ?        00:00:00 s.php
 3184 ?        00:00:00 s.php
27596 ?        00:00:00 s.php
17509 ?        00:00:00 s.php
10471 ?        00:00:00 s.php
20244 ?        00:00:00 s.php
31762 ?        00:00:00 s.php
12067 ?        00:00:00 s.php
14125 ?        00:00:00 s.php
13003 ?        00:00:00 s.php
25257 ?        00:00:00 s.php
22482 ?        00:00:00 s.php
 3117 ?        00:00:00 s.php
27203 ?        00:00:00 s.php
28537 ?        00:00:00 s.php
30534 ?        00:00:00 s.php
  873 ?        00:00:00 s.php
 8452 ?        00:00:00 s.php
  306 ?        00:00:00 s.php
15703 ?        00:00:00 s.php
24708 ?        00:00:00 s.php
 5745 ?        00:00:00 s.php
11949 ?        00:00:00 s.php
   34 ?        00:00:00 s.php
22545 ?        00:00:00 s.php
11775 ?        00:00:00 s.php
12333 ?        00:00:00 s.php
27383 ?        00:00:00 s.php
  612 ?        00:00:00 s.php
 6437 ?        00:00:00 s.php
14648 ?        00:00:00 s.php <defunct>
14040 pts/1    00:00:00 tcsh
14027 pts/1    00:00:00 ps

I am really confused; does anybody have any idea? Thank you very much.

Liang

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
RE: [PHP] Question about exec() calling a CLI PHP script from Apache CGI/filter-module PHP
I forgot to mention that, called directly from the shell, s.php runs correctly:

marvin:~/liang.ns2user.info/php> ./s.php
This is a PHP-CLI Script!!0123456789

The PHP configuration of this site is: http://liang.ns2user.info/php/info.php

Thank you in advance.

Liang

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
[PHP] Question on exec() to call a php script from a php script
If my PHP is configured (http://liang.ns2user.info/php/info-train06.htm) as an Apache 2.0 filter module, PHP is invoked by Apache on an HTTP client's request. Can I do an exec() to call another PHP script and have it run in the background? If yes, how? I would highly appreciate it if somebody could help. Thank you.

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
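One pattern that usually works for this (a sketch, not from the thread; the PHP binary path and the worker script name are assumptions, and whether exec() is allowed at all depends on the server's safe_mode and permissions) is to redirect the child's output and put it in the background. Without the redirection and the trailing "&", exec() blocks until the child exits:

```php
<?php
// Sketch: run another PHP script in the background from a web request.
// Paths below are illustrative only.
function background_command($cmd)
{
    // Redirect stdout/stderr and background the job so exec()
    // returns immediately instead of waiting for the child.
    return $cmd . ' > /dev/null 2>&1 &';
}

exec(background_command('/usr/local/bin/php /path/to/worker.php'));
?>
```

The output redirection matters: if the child keeps its stdout attached to the web server's pipe, exec() waits for that pipe to close even with "&".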
Re: [PHP] On register_shutdown_function: What might be the problem?
server. Upon the next request, the program only needs to make sure the needed data has already been stored on the hard drive, then transform it and send it back. Since the harvester user agent normally has a 180-second timeout, it is necessary to respond within that period. I really need your suggestion. Thank you very much again.

Liang

Try something like this:

<?php
ignore_user_abort(true);
header("Location: redirect2.html");
echo "foo\n";
flush();
for ($i = 0; $i < 10; $i++) {
    echo $i;
    sleep(1);
}
$fp = fopen("/tmp/foo.txt", "a");
fputs($fp, $i);
fclose($fp);
?>

Liang ZHONG wrote:
Sorry, that does not seem to work here. The code below takes minutes to show up in the browser. Any more suggestions?

<?php
set_time_limit(5);
function f() {
    set_time_limit(100);
    $count = 5;
    for ($i = 0; $i < $count; $i++) {
    }
    echo "end";
    exec("touch /home/.nappy/liang/liang.ns2user.info/php/aaa");
}
register_shutdown_function('f');
ignore_user_abort(true);
header("Content-type: text/plain");
header("Location: y.html");
$count = 5;
for ($i = 0; $i < $count; $i++) {
    echo "\n";
}
flush();
?>

Liang

If you don't flush some output after setting the header() then the headers won't go out until the end of the request. So do something like:

ignore_user_abort(true);
header("Location: http://whatever");
echo "foo\n";
flush();

Then whatever comes after this should run and the browser is long gone.

-Rasmus

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] On register_shutdown_function: What might be the problem?
Thank you for replying. Sorry for being long again. I tried your suggestion:

<?php
ignore_user_abort(true);
header("Location: redirect2.html");
echo "foo\n";
flush();
for ($i = 0; $i < 10; $i++) {
    echo $i;
    sleep(1);
}
$fp = fopen("/tmp/foo.txt", "a");
fputs($fp, $i);
fclose($fp);
?>

The browser did not get the redirect page until the new $i (10) had been appended to foo.txt. I also tried

ob_start();
header("Location: http://liang.ns2user.info/php/y.html");
echo "foo\n";
ob_end_flush();

and that does not work either. I think it might be the case you suggested: if the client sticks around after seeing the redirect and doesn't redirect until after the server has closed the connection, then there is no way to force a close. So may I draw these conclusions:

1. A server in PHP cannot actively close the connection to make the client see the redirect page immediately after the header has been set?
2. The connection is not closed even when the main PHP script ends and the shutdown function is running?

If the above conclusions are not true, could something be wrong in the Apache/PHP configuration, or elsewhere? If they are true, I think I have to redesign the project's flow control, which will take some time, since I have already finished the code assuming no such problem. I also think this might result in an awkward design for the project (I mean JSP would make a relatively more elegant design easier to achieve). I have used several other programming languages for many years but am new to PHP, so I probably made assumptions based on my experience with other languages that are not true here. The project I am now adding enhancements to is written in PHP, and I am trying to keep its main design. I did not expect it would be difficult to achieve this:

1. write an XML page
2. let the HTTP client get this XML page in time
3. continue doing something time-consuming

Now I have difficulty with 2 and 3. My client is normally Perl LWP::UserAgent, and sometimes a browser.
I wonder how to let the client get the page in time before I continue to finish the rest of the work. Otherwise the client will time out and get no result. Please help! Thank you.

Liang

I really didn't follow all that. But this stuff is not that complex. header() sets a header to be sent when output goes out; header() does not send an actual header at that point. If you don't send any output, then the headers won't go out until the script terminates. If you have any sort of output buffering enabled, then even if you think you are sending something, you may only be buffering it, and in that case again the headers won't go out until the request is done.

Also, ignore_user_abort() controls whether or not your script will be terminated when we are able to detect that the user has aborted. A user abort is defined as nobody being around to read the data we are sending out. If you don't send anything, we can't detect whether the browser has gone away. Generally browsers will redirect as soon as they see a Location: redirect header, but that could be client-specific. If the client sticks around after seeing the redirect and doesn't redirect until after the server has closed the connection, then there is no way to force a close. If it does redirect and close its end of the connection, then if you set ignore_user_abort(false) your script will be terminated as soon as it tries to send something further, and your shutdown function will be called at that point.

-Rasmus

Liang ZHONG wrote:
Hi Rasmus, this may be a little bit long; sorry for taking your time. It still does not work as expected. I tried some experiments and found that if I call some function or write some code other than calling header(), register_shutdown_function and the other parts of the code work as expected. For example:

<?php
set_time_limit(5);
function f() {
    set_time_limit(10);
    // doing something time-consuming
}
some_function();
?>
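Rasmus's explanation above can be sketched as follows (a minimal illustration; the 4 KB padding size is a guess at typical server buffer sizes, not something stated in the thread):

```php
<?php
// Sketch: make the queued Location header actually go out before the
// slow part of the script runs. header() alone only queues the header;
// sending some output and calling flush() pushes it to the client.
function padding($bytes)
{
    // whitespace is harmless in the body of a redirect response
    return str_repeat(' ', $bytes) . "\n";
}

ignore_user_abort(true);             // keep running after the client leaves
header("Location: redirect2.html");  // queued, not yet sent
echo padding(4096);                  // enough output to push past buffers (a guess)
flush();

// ... slow work placed here runs after the client has been redirected ...
?>
```

Whether the browser actually drops the connection at this point is client-specific, as Rasmus notes; the flush only guarantees the header leaves PHP.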
The time limit of 5 will be the limit for some_function(), and the 10 will be the limit for function f(), respectively. Code example:

<?php
set_time_limit(1);
ignore_user_abort(true);
function say_goodbye() {
    $st = connection_status();
    print "Status 1: " . $st . "\n";
    set_time_limit(10);
    $st = connection_status();
    print "Status 2: " . $st . "\n";
    $count = 2000;
    for ($i = 0; $i < $count; $i++) {}
    print "End!\n";
    exec("touch /home/.nappy/liang/liang.ns2user.info/php/bbb");
}
register_shutdown_function("say_goodbye");
print "Sleeping...\n";
$count = 1000;
for ($i = 0; $i < $count; $i++) {}
print "Done!\n";
?>

-bash-2.05b$ curl -N
Re: [PHP] On register_shutdown_function: What might be the problem?
I tested the link (http://lerdorf.com/red.php) using a browser (Firefox 1.0), and it worked as expected: the redirect2.html page displayed within 2 seconds. I put the exact code into my testing environments (2 places), as here: http://liang.ns2user.info/php/red.php, and the page shows up in about 13 seconds using the SAME browser.

Observation 1: the same client (web browser on the same machine) running the same code against different environments gives different results.

I use Perl code (to get the page from http://lerdorf.com/red.php):

#!/usr/bin/perl -w
use strict;
use LWP::Simple;
use LWP::UserAgent;

# get input from command line
my ($url) = @ARGV;

# set default URL if none is given
if ( !$url ) {
    $url = 'http://liang.ns2user.info/liang/y.php';
    print STDOUT qq{Using $url as url to fetch\n};
}

# get the html page from the web, into a string
my $ua  = LWP::UserAgent->new( keep_alive => 1, timeout => 180 );
my $req = HTTP::Request->new( GET => $url );
my $res = $ua->request( $req );
if ( !$res->is_success ) {
    my $error = $res->message();
    print STDOUT qq{ERROR: with pid of $$ has ended for the following error: $error};
    exit;
}
my $htmlPage = $res->content;
print $htmlPage . "\n";
exit;

It takes about 13 seconds to get:

marvin:~/liang.ns2user.info/perl> ./http_req.pl http://lerdorf.com/red.php
You are now on redirect2.html

Observation 2: different clients (browser and Perl LWP::UserAgent) requesting the same code on the same machine get different results.

Question: could it be the configuration of Linux/Apache/PHP on the server side, rather than the client side, that makes the difference? (Two of my testing environments are Red Hat Linux with Apache 2.0; one is PHP 5.0.4, the other 4.3.10.) The PHP configuration is: http://liang.ns2user.info/php/info.php. I have no read permission on the httpd.conf files, so I do not know how Apache is configured. Any hint? Thank you.

Liang

That's a client-side issue then, because it is certainly sent. Trying your exact script:

<?php
ignore_user_abort(true);
header("Location: redirect2.html");
echo "foo\n";
flush();
for ($i = 0; $i < 10; $i++) {
    echo $i;
    sleep(1);
}
$fp = fopen("/tmp/foo.txt", "a");
fputs($fp, $i);
fclose($fp);
?>

It's at http://lerdorf.com/red.php if you want to test it yourself.

3:53pm colo:/var/www/lerdorf.com> telnet localhost 80
Trying 127.0.0.1...
Connected to colo.
Escape character is '^]'.
GET /red.php HTTP/1.0

HTTP/1.1 302 Found
Date: Tue, 26 Jul 2005 22:53:40 GMT
Server: Apache/1.3.33 (Debian GNU/Linux) PHP/4.4.1-dev
X-Powered-By: PHP/4.4.1-dev
Location: redirect2.html
Connection: close
Content-Type: text/html; charset=utf-8

foo
<10 second delay here>
Connection closed by foreign host.

-Rasmus
Re: [PHP] On register_shutdown_function: What might be the problem?
Sorry for bothering again, but I did not mention the other environment I tested on, since it has access control against outsiders. I saved its info page to: http://liang.ns2user.info/php/info-train06.htm. There PHP runs as an Apache 2.0 filter module, and the result of the experiment is the same as with the CGI SAPI one. Could you please also take a look at this PHP configuration? I would also like to see what the environment of http://lerdorf.com/red.php looks like, if you think that is OK.

Another question that confuses me a lot is why the browser and the Perl code behave differently in response to the same link, http://lerdorf.com/red.php. The user clients of our project are mainly harvesters written in Perl (like the one in my previous email), so I really hope the Perl code can be made to work in an example; then I will be able to find a workable configuration. I really appreciate all your kind help.

Liang

Liang ZHONG wrote:
> The PHP configuration is: http://liang.ns2user.info/php/info.php. I have no read permission on the httpd.conf files, so I do not know how Apache is configured.

That shows PHP is running as a CGI. As a CGI, PHP has very little control over anything; it is completely at the mercy of the web server to do everything. That's likely the source of your problem. You need a better server with PHP running as an Apache module.

-Rasmus

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] On register_shutdown_function: What might be the problem?
I think I did not express myself clearly. What I want is to be able to redirect the user to an existing page (so they get it immediately), to close the connection actively (NOT passively by user abort), and finally to run the function in the background. But redirecting with header() and a Location has a problem: the page reaches the user only after the entire program, including the registered shutdown function, finishes running, which is against my intent.

I put a time-consuming task into a function registered as a shutdown function, hoping it would run after the user has received the redirect page and the connection has been closed. But experiments (using browsers, the curl command-line tool, as well as Perl LWP code) show that the user gets the redirect page only after the shutdown function finishes, which contradicts the description of register_shutdown_function on the PHP website. It seems that only the header() redirect has this problem (not taking effect until register_shutdown_function finishes), while functions like print()/echo() and exec() do not. The code looks like:

<?php
set_time_limit(1);
function f() {
    set_time_limit(20);
    $count = 5000;
    for ($i = 0; $i < $count; $i++) {
    }
    echo "end";
    exec("touch /home/.nappy/liang/liang.ns2user.info/php/aaa");
}
register_shutdown_function('f');
header("Content-type: text/plain");
header("Location: y.html");
?>

The HTTP client that sends the request to this PHP program only gets the page back as a response after function f finishes (file aaa created). Changing $count makes a lot of difference. My BIGGEST question is: how can I make the user get the redirect page immediately after header() is called, rather than after f() ends, while making sure that f() still fully (and perhaps slowly) executes? Thank you very much for kindly replying.

With high respect,
Liang

Liang ZHONG wrote:
> My question is: What is the correct way to keep the function running after I redirect the HTTP client to an existing page (which I want the client to get immediately) and then immediately close the connection?

ignore_user_abort(true);

-Rasmus

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
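One technique that sometimes answers this "respond, then keep working" question without relying on the shutdown function is to announce the exact body length and a closing connection before flushing. This is a sketch under assumptions not stated in the thread: no output handler (mod_gzip etc.) rewriting the body, and a client that honors Content-Length and Connection: close rather than waiting for the socket to drop.

```php
<?php
// Sketch: tell the client exactly how much body to read and that the
// connection will close, so it can stop waiting before the slow work
// finishes. Fails silently if the server re-buffers or chunks output.
ignore_user_abort(true);
ob_start();
echo "You are now being redirected...\n";        // placeholder body
header("Location: y.html");
header("Connection: close");
header("Content-Length: " . ob_get_length());
ob_end_flush();   // send the buffered body
flush();          // push it through PHP's and Apache's buffers

// ... the time-consuming work (the old f()) would run here ...
set_time_limit(0);
?>
```

This moves the slow work into the main script body instead of a shutdown function, so the time-limit interaction described later in the thread does not apply.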
Re: [PHP] Question about apache-php concurrent process control
Thank you Richard. I think I'd better explain a little about the project, and then you or somebody else might have good suggestions given the project's restrictions.

The project implements a digital-library protocol called OAI-PMH (http://www.openarchives.org/OAI/openarchivesprotocol.html). It acts as a broker for harvester clients to get metadata from libraries that have a Z39.50 server. The databases reside at the libraries and vary a lot in speed, number of records, and the way they accept connections from a Z39.50 client. The number of records at some libraries may be over a million, so the part that gets data from those libraries behaves very differently from site to site.

The harvester client sends an HTTP request, normally through a program such as Perl LWP, and normally sets a 180-second connection timeout. According to the protocol, the OAI-PMH data provider acts in response to the harvester's HTTP request: it begins to connect to the specific library's Z39.50 server, gets data in, writes it to disk, translates it to another XML format, and then sends it to the harvester client. If there are too many records, the OAI-PMH provider should send back partial data with a resumption token. The harvester can later send an HTTP request to the same URL, with the resumption token as one of the POST variables, to get further data records. This process can continue until all the records have been sent. Thus I normally use a Perl program to send the HTTP request and get the content instead of a BROWSER, so the buffering behavior should not be due to browser settings.

I cannot echo the metadata directly back, since XSLT must be used to transform it and new XML file(s) are written. The header() redirection would be very natural to use, if only it could close the connection before I do something very time-consuming afterwards. The exec and cron-job approaches are hard to use, since a Z39.50 connection with a lot of state variables (connection id, etc.) cannot easily be passed to another script.

The harvester user is normally not a human with a browser but a piece of code that loops, sending HTTP requests as long as the page it gets back has a resumptionToken tag (it takes the element between the open and close tags of resumptionToken and appends it to the next HTTP request as a POST variable for the next page of records). The problem is that each HTTP request imposes a timeout of 180 seconds, so I have to return partial data within 3 minutes while the whole process might take hours or even days. The process then continues to get data from the library server, transform it, and write it to disk in a particular directory. When the next request with a resumption token comes in, the program checks for the existence of the directory and returns the data if it is there; if not, the program either returns within 3 minutes or sends back "not available" information. Sorry for the long writing. I hope someone has a suggestion for me. Thank you very much.

---

> I now encounter a problem with flow control of my program in PHP. This is very crucial to the design of a pretty big project. This is what I want to do in the program:
>
> <?php
> do_A();
> header("Location: " . $result_of_do_A);

Depending on the buffering options in php.ini and/or Apache, this may or may not just end your program, as I understand it. Once you send the Location header, everything else is irrelevant, or at least not reliable. You could do:

echo $result_of_do_A;
flush();

and the user will see what happened with A, while waiting for B.

> do_B();
> ?>
>
> Since do_B() takes quite a while to finish, I want the HTTP client to get the partial result from do_A() via a redirect before do_B() starts. But it seems that the redirection only occurs after the entire PHP program finishes, i.e., after do_B(). I sent HTTP requests through a browser, the curl command line with the -N (no buffer) option, and a Perl LWP program I wrote. All of them suggest that header(), although placed before do_B() in the code, is only acted on after all the PHP code has finished. I added flush() after header() too, but it didn't work.

If that is what you are seeing happen, you probably have output buffering turned on. The Location: header is acted upon by the BROWSER, not by PHP and not by your server. The BROWSER sees that header and then jumps somewhere else.

> My question is: Is there any way that I can return to the client through the HTTP response and then continue with my program?

You could also look into the pcntl stuff to fork() or, depending on various settings, you might get:

exec("do_B() &");

to get B to happen in the background.

With all that said: as a general rule, when I found myself doing this kind of stuff, I later realized that I hadn't really designed my application very well for the end-user experience. If it takes THAT long to finish B, then you're probably MUCH better off putting something in a ToDo List in your database, and
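The ToDo-list idea the reply starts to describe can be sketched roughly as follows. Everything here is an assumption for illustration (the thread does not show the actual design): a spool directory is used instead of a database table to keep the sketch self-contained, and the worker would be run from cron.

```php
<?php
// Sketch of the "ToDo list" pattern: the web request only records the
// job and returns fast; a cron-driven CLI worker does the slow part.
// Directory and file format are invented for this example.
define('SPOOL_DIR', '/tmp/oai_jobs');

// Web-request side: enqueue instead of doing the slow work inline.
function enqueue_job($params)
{
    if (!is_dir(SPOOL_DIR)) {
        mkdir(SPOOL_DIR, 0777, true);
    }
    $file = tempnam(SPOOL_DIR, 'job');
    file_put_contents($file, serialize($params));
    return $file;
}

// Worker side, run from cron (e.g. "* * * * * php worker.php"):
// claim and return one pending job, or null if the spool is empty.
function next_job()
{
    foreach (glob(SPOOL_DIR . '/job*') as $file) {
        $params = unserialize(file_get_contents($file));
        unlink($file);   // claim the job so no other worker repeats it
        return $params;
    }
    return null;
}
?>
```

This sidesteps the connection-state objection only partially: as noted above, live Z39.50 connection state cannot be serialized this way, so the worker would have to reconnect from the recorded parameters.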
[PHP] On register_shutdown_function: What might be the problem?
<?php
set_time_limit(0);
function f() {
    $count = 1000;
    for ($i = 0; $i < $count; $i++) {
    }
    exec("touch /tmp/ccc");
}
register_shutdown_function('f');
header("Content-type: text/plain");
header("Location: y.html");
?>

When the time limit is set to 0, the redirect page is shown 20 seconds later, after the file ccc is created. When the time limit is set to 5, the redirect page is shown in 5 seconds and the ccc file is not created. The error from the curl command-line tool is:

<br/><b>Fatal error</b>: Maximum execution time of 5 seconds exceeded in <b>/./y.php</b> on line <b>6</b><br/>

What might be the problem, that my registered shutdown function cannot continue to run after the main program ends? What is the correct way to keep the function running after I redirect the HTTP client to an existing page and then immediately close the connection? Thank you very much for your help.

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
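Following the advice given later in this thread (the shutdown function must also obey the time limit, and exceeding it is a fatal error), one way to avoid the "Maximum execution time exceeded" failure is to lift the limit as the very first thing the shutdown function does. A minimal sketch; the touch path is just an example:

```php
<?php
// Sketch: reset the execution-time limit inside the shutdown function
// itself, so the fatal "Maximum execution time exceeded" error cannot
// kill the shutdown work partway through.
function f()
{
    set_time_limit(0);   // lift the limit before any slow work

    // ... time-consuming work here ...
    exec("touch /tmp/ccc");   // illustrative side effect only
}
register_shutdown_function('f');
?>
```

Note this fixes only the fatal-error half of the problem; it does not, by itself, make the redirect reach the client any sooner.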
Re: [PHP] On register_shutdown_function: What might be the problem?
I want the HTTP client to see the page (here y.html) immediately after I call the header function, then to close the connection, with function f running in the background. But after trying many, many times, it seems that I have to either set the time limit small, which sends the HTML page sooner but also terminates the background function registered as a shutdown function, or set the time limit long enough for the shutdown function to finish, which keeps the HTTP client waiting until the function returns. I doubt register_shutdown_function is meant to behave this way by design. I would like to find out what could be wrong in my code or in the configuration of PHP or Apache.

My question is: What is the correct way to keep the function running after I redirect the HTTP client to an existing page (which I want the client to get immediately) and then immediately close the connection? Thank you very much.

Given that it's a fatal error, it's as bad as a syntax error: it cancels everything it's doing and leaves. Remember that even the shutdown function has to obey the time limit.

On 7/22/05, Liang ZHONG [EMAIL PROTECTED] wrote:
> [original message quoted]

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] Question about apache-php concurrent process control
I now encounter a problem with flow control of my program in PHP. This is very crucial to the design of a pretty big project. This is what I want to do in the program:

<?php
do_A();
header("Location: " . $result_of_do_A);
do_B();
?>

Since do_B() takes quite a while to finish, I want the HTTP client to get the partial result from do_A() via a redirect before do_B() starts. But it seems that the redirection only occurs after the entire PHP program finishes, i.e., after do_B(). I sent HTTP requests through a browser, the curl command line with the -N (no buffer) option, and a Perl LWP program I wrote. All of them suggest that header(), although placed before do_B() in the code, is only acted on after all the PHP code has finished. I added flush() after header() too, but it didn't work. My question is: Is there any way that I can return to the client through the HTTP response and then continue with my program? Thank you very much for your kind help.

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] Question about apache-php concurrent process control
Thank you Rouvas. I have never used the tools you mentioned, but I will definitely give them a try. I wrote a Perl script using LWP as an HTTP user agent with the timing approach you suggested, and it works well. I am new to this forum and new to PHP. It seems Rasmus is famous, and I should have shown my respect. Thanks again to all of you for your kind help.

Hi Liang,
trying to get conclusive results with browsers is futile. Use a command-line tool (like curl) to invoke the web pages and get the results. Or you can use PHP's own functions to query the web server and do your own timing with the microtime() function, or another one suitable for your purposes. In order for flush() results to reach you (in a browser), they have to pass through multiple caches, like PHP's, Apache's, the occasional proxy's, and finally the browser's own cache. So you cannot get dependable results measuring times or responses from your browser. Try the methods above. And a final tip: when Rasmus speaks, you don't question him :-) Period.

Have a nice day,
-Stathis

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
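Stathis's suggestion to do your own timing with microtime() can be sketched like this. Note that in the PHP 4 versions discussed in this thread, microtime() returns a "msec sec" string, so it is split manually here rather than using the PHP 5-only microtime(true); the timed loop is a stand-in for the real HTTP fetch.

```php
<?php
// Sketch: measure elapsed time around a request server-side, without a
// browser cache in the way.
function now()
{
    // microtime() returns e.g. "0.12345600 1122400000"
    list($usec, $sec) = explode(' ', microtime());
    return (float)$usec + (float)$sec;
}

$start = now();
// Stand-in for the real work: replace this loop with the actual fetch
// (curl, LWP, or file_get_contents) to time the server's response.
for ($i = 0; $i < 100000; $i++) {}
$elapsed = now() - $start;

printf("took %.6f seconds\n", $elapsed);
?>
```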
Re: [PHP] Question about apache-php concurrent process control
Hi André,

It sounds interesting, but since I am pretty new to PHP, I have some questions, naive maybe, about what you wrote:

#!/bin/sh\n/path/to/script/Send.php 12 &\n

What does Send.php look like? I have no idea how a shell interprets a PHP script, or what the parameter 12 means here. If you do not mind, could you please also let me look at your Send.php? Thank you very much.

I did something like that for a newsletter-sending script. Basically, I had two scripts:
a) AddEdit.php, which would list the newsletter's items and allow it to be sent
b) Send.php, a script I ran in the background

When Send was pressed on AddEdit, it would do something like:

$tempName = tempnam( '/tmp', 'newsletter' );
$fp = fopen( $tempName, 'w+' );
fputs( $fp, "#!/bin/sh\n/path/to/script/Send.php 12 &\n" );
fclose( $fp );
chmod( $tempName, 0755 );
system( $tempName . ' &' );

That way, it would launch the second script into the background, checking whether the script had altered the newsletter's state to Sent every time a user viewed the newsletter's details. Hope it helped.

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
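For reference, a self-contained version of André's trick as I read it (the worker path, the id argument, which in his case was presumably a newsletter id, and the output redirection are assumptions): the web script writes a tiny throw-away shell script, marks it executable, and runs it with a trailing "&" so the worker keeps running after the request ends.

```php
<?php
// Sketch: launch a CLI PHP worker in the background via a temporary
// shell script. Paths and the id argument are illustrative only.
function launch_background($workerPath, $id)
{
    $tempName = tempnam('/tmp', 'job');

    // The worker itself needs a "#!/usr/local/bin/php" shebang line
    // and execute permission for this invocation style to work.
    $fp = fopen($tempName, 'w');
    fputs($fp, "#!/bin/sh\n$workerPath $id > /dev/null 2>&1 &\n");
    fclose($fp);
    chmod($tempName, 0755);

    system($tempName . ' > /dev/null 2>&1 &');
    return $tempName;   // caller may unlink() it later
}
?>
```

The redirection to /dev/null matters here for the same reason as with exec(): without it, the backgrounded job can keep the web server waiting on an open output pipe.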
RE: [PHP] Question about apache-php concurrent process control
Can somebody here help me to delete my message? People who read this must be laughing at me. :) <snip>It seems Rasmus is famous, and I should have shown my respect.</snip> From the history section of the PHP manual (php_manual_en.chm): PHP succeeds an older product, named PHP/FI. PHP/FI was created by Rasmus Lerdorf in 1995, initially as a simple set of Perl scripts for tracking accesses to his online resume. He named this set of scripts 'Personal Home Page Tools'. As more functionality was required, Rasmus wrote a much larger C implementation, which was able to communicate with databases, and enabled users to develop simple dynamic Web applications. Rasmus chose to release the source code for PHP/FI for everybody to see, so that anybody could use it, as well as fix bugs in it and improve the code.
[PHP] Question about apache-php concurrent process control
I am a programmer and new to php. I wonder what process control the php interpreter can do for multithreading. I notice that through http requests, the php interpreter can execute 2 different php programs simultaneously, but requests are queued and executed sequentially if I try to execute the same php code at the same time. Can somebody tell me whether it is by php's design that the same code is not executed concurrently, or an apache/php configuration problem that I can fix? Thank you very much
Re: [PHP] Question about apache-php concurrent process control
As far as I know, apache-php works this way: when the web server gets an http request for a file with the extension .php, it starts the php interpreter to process the php file. Now I have a php program that does something like this: when it is executed with one parameter, p1, the code contacts a database, gets back a huge amount of data, and writes it to the file system. This takes a pretty long time to finish. In the meantime, the same php code is invoked by apache with another request, passing a different parameter, p2, which tells the code to check the availability of, and read part of, the data that has already been written to the file system. The program needs to live at the same URL, with only a different parameter passed to it, for example: http://baseURL/a.php?v=p1 http://baseURL/a.php?v=p2 I want these 2 processes to run concurrently, but through a small test I found out that they just execute sequentially, one after the other. http://baseURL/a.php and http://baseURL/b.php, however, can run concurrently, which means that apache can invoke multiple php interpreter processes. But I do not know how to make the php interpreter do the same thing for one program, i.e. have 2 php interpreter processes run one php program separately without interfering with each other. Hope I have explained myself clearly. Thank you. From: Rasmus Lerdorf [EMAIL PROTECTED] To: Liang [EMAIL PROTECTED] CC: php-general@lists.php.net Subject: Re: [PHP] Question about apache-php concurrent process control Date: Fri, 15 Jul 2005 13:46:37 -0700 Liang wrote: I am a programmer and new to php. I wonder what process control the php interpreter can do for multithreading. I notice that through http requests, the php interpreter can execute 2 different php programs simultaneously, but requests are queued and executed sequentially if I try to execute the same php code at the same time. Can somebody tell me whether it is by php's design that the same code is not executed concurrently, or an apache/php configuration problem that I can fix?
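A hypothetical sketch of what the a.php described above might look like; the file name, parameter handling, and output strings are assumptions for illustration, not code from the thread.

```php
<?php
// v=p1: run the long database export; v=p2 (or anything else): report
// how much of the export has been written so far.
$file = '/tmp/export.dat';           // assumed scratch file
$v = isset($_GET['v']) ? $_GET['v'] : '';

if ($v === 'p1') {
    // Long-running job: pull rows from the database, stream them to disk.
    $out = fopen($file, 'w');
    // while ($row = fetch_next_row()) { fwrite($out, $row); }  // pseudo-call
    fclose($out);
    echo "export finished";
} else {
    // p2: check availability and report progress on the same file.
    clearstatcache();                // filesize() results are cached by PHP
    echo file_exists($file) ? filesize($file) . " bytes written" : "not started";
}
```

Since each request is handled by its own Apache process, a p2 request can poll the file while a p1 request is still writing it, which is exactly the concurrency being asked about.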
Apache is a multi-process single-threaded architecture. 1 request, 1 process. Concurrent requests are handled by concurrent processes. What is it you think you need a second thread for during a request? -Rasmus
Re: [PHP] Question about apache-php concurrent process control
Could you please explain it a little bit more? I tested this way. The code is the same for a.php and b.php: <?php sleep(20); print "Done.<br />"; ?> I placed requests from 2 browser windows. The first time, I requested http://baseURL/a.php in both browsers, with starting times 5 seconds apart. The first Done showed after 20 seconds, and the second Done showed 20 seconds after the first Done. Then I requested http://baseURL/a.php in one browser and http://baseURL/b.php in the second, again starting 5 seconds apart. This time the first browser showed Done after 20 seconds, and 5 seconds later the second browser showed Done too. Thus it seems that apache can spawn multiple php interpreters in response to http requests, while php cannot run the same program concurrently. I do not know whether this is due to a design limitation in php, or whether I did not configure php correctly to use its full functionality? Thank you Liang ZHONG wrote: As far as I know, apache-php works this way: when the web server gets an http request for a file with the extension .php, it starts the php interpreter to process the php file. Now I have a php program that does something like this: when it is executed with one parameter, p1, the code contacts a database, gets back a huge amount of data, and writes it to the file system. This takes a pretty long time to finish. In the meantime, the same php code is invoked by apache with another request, passing a different parameter, p2, which tells the code to check the availability of, and read part of, the data that has already been written to the file system. The program needs to live at the same URL, with only a different parameter passed to it, for example: http://baseURL/a.php?v=p1 http://baseURL/a.php?v=p2 I want these 2 processes to run concurrently, but through a small test I found out that they just execute sequentially, one after the other. Sorry, your test is wrong then.
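Since browsers can silently serialize requests to identical URLs (which turns out to be the culprit later in this thread), it helps to check concurrency without a browser at all. This self-contained CLI sketch (not from the thread; Unix shell quoting assumed) launches two child PHP processes that each sleep 2 seconds and measures the total wall time:

```php
<?php
// popen() returns immediately, so both children start at nearly the same
// moment; pclose() then waits for each child to exit.
$cmd = PHP_BINARY . " -r 'sleep(2);'";

$start = microtime(true);
$p1 = popen($cmd, 'r');
$p2 = popen($cmd, 'r');
pclose($p1);                         // blocks ~2 s for the first child
pclose($p2);                         // second child has already finished
printf("wall time: %.1f s\n", microtime(true) - $start);
```

A wall time near 2 seconds shows the processes overlapped; near 4 seconds would indicate serialization. The same measurement made with an HTTP client against a.php would distinguish server-side serialization from browser behavior.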
Apache/PHP does not serialize requests like that. If somehow the backend you are collecting data from is serializing the requests, then you are out of luck, and frontend threads aren't going to solve that, since your threads would simply be serialized as well. You are on the wrong track here. -Rasmus
Re: [PHP] Question about apache-php concurrent process control
Hi Rasmus, You are right. It was a problem with the browser. I had used Mozilla Firefox for the test, and for whatever reason it serialized the http requests to identical urls. I then tested with 2 IE 6.0 windows, with 2 tabs within the Maxthon browser, and with one IE window and one Firefox window. Then I got the result you described. Thank you very much for the help. BTW, I could not get flush() to work, nor flush() together with ob_flush(). I tried almost all the methods mentioned in the posts under http://us2.php.net/flush, but none of them really pushed the buffer out. The site is configured as shown at http://liang.ns2user.info/php/info.php, on Red Hat, kernel 2.4.29. What can I do to get it to work? Thank you again. Liang ZHONG wrote: Could you please explain it a little bit more? I tested this way. The code is the same for a.php and b.php: <?php sleep(20); print "Done.<br />"; ?> I placed requests from 2 browser windows. The first time, I requested http://baseURL/a.php in both browsers, with starting times 5 seconds apart. The first Done showed after 20 seconds, and the second Done showed 20 seconds after the first Done. Then I requested http://baseURL/a.php in one browser and http://baseURL/b.php in the second, again starting 5 seconds apart. This time the first browser showed Done after 20 seconds, and 5 seconds later the second browser showed Done too. Thus it seems that apache can spawn multiple php interpreters in response to http requests, while php cannot run the same program concurrently. I have no idea what you did to configure it this way. I wouldn't even know how to do that if you asked me to. As far as PHP is concerned, it has no idea which processes are handling which script files at any one point. So whether you request a.php and b.php at the same time, or a.php twice at the same time, it makes absolutely no difference to PHP. If you are really seeing this, then the limitation is in your browser or somewhere else.
Try making a.php output stuff so it becomes easier to see, as in: for ($i = 0; $i < 20; $i++) { echo $i; flush(); sleep(1); } You should see the counter start counting as soon as you hit the page. -Rasmus
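For the case where flush() output still never appears (as in the follow-up above), here is a hedged sketch of the usual checklist. All the functions used are standard PHP output-control functions, but whether output actually reaches the browser also depends on Apache modules (mod_deflate/mod_gzip buffer the whole response), intermediate proxies, and the browser itself; the 4096-byte padding size is a common rule of thumb, not a guarantee.

```php
<?php
// Drop any of PHP's own output buffers so echo goes straight to Apache.
while (ob_get_level() > 0) {
    ob_end_flush();
}
ob_implicit_flush(true);             // flush automatically after each output

// Padding: some browsers wait for a few KB before rendering anything.
echo str_pad('', 4096), "\n";

for ($i = 0; $i < 20; $i++) {
    echo $i, ' ';
    flush();                         // push PHP/Apache buffers toward client
    sleep(1);
}
```

If this still shows nothing incrementally, the buffering is happening outside PHP, e.g. in a compression module or a proxy.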