Re: [PHP] MySQL to blame? (was Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?)
M5 wrote: On 20-Dec-07, at 1:17 AM, Per Jessen wrote: René Fournier wrote: I'm really not sure what to try next. ps -aux shows MySQL as hogging the CPU, not PHP or Terminal: When this happens, do a 'SHOW PROCESSLIST' in mysql to see what it's doing. I have, and I can't see anything unusual. There are a few scripts that loop with very low overhead (with sufficient sleep(), etc.) plus a few outside HTTP requests. Nothing unusual. Incidentally, yesterday, when MySQL went to 80-90% again after a week, I let it stay there while I poked around MySQL (doing the above) and the OS to see if there is some magical limit that I might be breaking. So the server ran with MySQL at 80-90% CPU for about eight hours. Everything still worked fine, scripts ran, the database was available. That's the thing about this problem--it's not a show-stopper, it's just really strange. And I can't figure out its source. Check your mysql logs from just before this time and see if there are any queries there that need attention. Do you have the slow-queries log enabled? Also make sure you have the option to log queries that don't use an index turned on. See if anything there gives you some clues. Are you committing a big transaction (thousands of records or something)? Or do you have a transaction idling and not committing or rolling back? Are you replicating data to another server and this is triggering the problem? Or is a backup running? It doesn't shut down. Finally--and I really hate doing this, because it seems dangerous to data (is it?)--I issue a kill -9 command to its process. I'd say it's very dangerous to do that to a database, but ask the mysql list; they will have better insight into what this does. -- Postgresql php tutorials http://www.designmagick.com/ -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
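The checklist above (SHOW PROCESSLIST, the slow-query log, logging unindexed queries) can be scripted rather than run by hand. A minimal sketch using the same old mysql_* API the rest of the thread uses; the host, credentials, and log path are placeholders, not anything taken from the thread:

```php
<?php
// Sketch only: host, user, and password are placeholders.
$link = mysql_connect('localhost', 'user', 'password')
    or die('connect failed: ' . mysql_error());

// 1. What is mysqld busy with right now?
$res = mysql_query('SHOW FULL PROCESSLIST', $link);
while ($row = mysql_fetch_assoc($res)) {
    echo "{$row['Id']}\t{$row['Time']}s\t{$row['State']}\t{$row['Info']}\n";
}

// 2. Is the slow-query log switched on? (MySQL 5.0 variable name)
$res = mysql_query("SHOW VARIABLES LIKE 'log_slow_queries'", $link);
print_r(mysql_fetch_assoc($res));

// The matching my.cnf settings would be something like:
//   log-slow-queries              = /var/log/mysql-slow.log
//   long_query_time               = 2
//   log-queries-not-using-indexes
```

Running this during one of the 80-90% episodes (and again when the box is healthy) gives two snapshots to diff.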
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
René Fournier wrote: I'm really not sure what to try next. ps -aux shows MySQL as hogging the CPU, not PHP or Terminal: When this happens, do a 'SHOW PROCESSLIST' in mysql to see what it's doing. /Per Jessen, Zürich
[PHP] MySQL to blame? (was Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?)
On 20-Dec-07, at 1:17 AM, Per Jessen wrote: René Fournier wrote: I'm really not sure what to try next. ps -aux shows MySQL as hogging the CPU, not PHP or Terminal: When this happens, do a 'SHOW PROCESSLIST' in mysql to see what it's doing. I have, and I can't see anything unusual. There are a few scripts that loop with very low overhead (with sufficient sleep(), etc.) plus a few outside HTTP requests. Nothing unusual. Incidentally, yesterday, when MySQL went to 80-90% again after a week, I let it stay there while I poked around MySQL (doing the above) and the OS to see if there is some magical limit that I might be breaking. So the server ran with MySQL at 80-90% CPU for about eight hours. Everything still worked fine, scripts ran, the database was available. That's the thing about this problem--it's not a show-stopper, it's just really strange. And I can't figure out its source. After not finding anything, I decided to restart the script. So I stop the [seemingly offending] script and wait for the CPU load to return to normal. It doesn't. MySQL remains at 80-90%. Even with all the other processes that call MySQL turned off, and the web server off, MySQL remains at 80-90%. Yet SHOW PROCESSLIST lists no processes, just the SHOW PROCESSLIST command I issue. With the load still high, I attempted to stop MySQL via the Administrator control panel. I waited a few minutes. It doesn't shut down. Finally--and I really hate doing this, because it seems dangerous to data (is it?)--I issue a kill -9 command to its process. Then it starts fine, I start the script in question, and everything is back to normal. ...Rene
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
On 11-Dec-07, at 2:13 PM, Per Jessen wrote: René Fournier wrote: However, the number of socket clients connecting in the past 3-4 months has steadily increased, and this seems to have exposed (if not created) a strange performance issue with PHP 5.2.4, MySQL 5.0.45 and/or Mac OS X Server 10.4.11. (I say and/or because I am unsure where the problem's cause really lies.) Basically, after the script has been running for a day or so (processing essentially the amount of data that used to take two weeks), the CPU usage of the machine goes from 30% (normal) to 80-90%. Have you tried stracing it to see what's really happening when the load goes that high? Good advice, since I think there's nothing left for me to do but inspect the MySQL process. Incidentally, I made some changes to the script a week ago, which seemed to improve the situation somewhat. Now, the script has run for nearly 7 days without interruption or high CPU load. Problem solved? Again this morning, I noticed CPU went up to 90% and is staying there. (Previously, this would happen after 1-2 days.) The number of distinct MySQL connections remains low, since the script (which runs in a loop, with a sleep(1) and a timeout on the stream_socket_select()) only creates one MySQL connection at the beginning. All MySQL queries run through that. The script has run for 6-7 days, during which time it's executed 2.7 million queries (mostly SELECTs) and created 105,198 external, short-lived child processes (each lasts about a second or two, then closes after mysql_close())--I don't think this is an issue. Memory usage seems okay. By the end of 7 days, the script's memory usage has peaked at 4MB (out of 16MB max). Typically it's around 3MB. MySQL memory usage doesn't seem to be a constraint. I'm using my-huge.cnf, and the nature of the queries is fairly regular. I would say that the database structure is not an issue (though I could be wrong!)--everything is pretty well normalized and indexed.
I'm logging slow queries, non-indexed SELECTs, etc. I'm really not sure what to try next. ps -aux shows MySQL as hogging the CPU, not PHP or Terminal:

    PID   COMMAND     %CPU   TIME      #TH  #PRTS  #MREGS  RPRVT   RSHRD   RSIZE   VSIZE
    342   mysqld      83.3%  16:13:19  33   125    139     435M    4.75M   439M    539M
    385   Terminal    4.7%   5:36:35   22   184    251     20.1M   33.5M   28.8M+  256M
    1190  php         4.3%   3:13:33   1    15     148     6.51M   8.15M   12.1M   89.0M
    0     kernel_tas  1.3%   2:02:40   47   2      619     5.00M-  0B      219M-   1.26G-

It's really strange and strangely consistent. The script will run for a few million cycles, whereupon MySQL suddenly uses 50% more CPU. But for what? I'm looking at tutorials on ktrace and kdump to see what I can learn from MySQL. I wonder if I would have this problem under Linux... ...Rene
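Since the script's own footprint (3-4 MB) is one of the few numbers visible from inside PHP, logging it over time may help tie the CPU jump to (or rule out) memory creep. A small sketch; the 10,000-cycle interval is an arbitrary choice:

```php
<?php
// Report this PHP process's current and peak memory, in MB.
// memory_get_usage(true) needs PHP >= 5.2 (the thread is on 5.2.4).
function memory_report()
{
    return sprintf('cur=%.2fMB peak=%.2fMB',
        memory_get_usage(true) / 1048576,
        memory_get_peak_usage(true) / 1048576);
}

// Inside the main loop one might log it periodically, e.g.:
//   if ($cycle % 10000 === 0) { error_log(memory_report()); }
echo memory_report(), "\n";
```

If the figure stays flat at ~3 MB while mysqld climbs, that points the finger away from the PHP side.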
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
Jochem Maas wrote: Have you tried stracing it to see what's really happening when the load goes that high? am I correct that that would be done like so?: strace -p <pid of php daemon> Yep, that's it. You'll probably want to record the output for analysis, but sometimes it's very obvious what's happening. Which doesn't mean it's also easy to fix, but it could give you a clue. /Per Jessen, Zürich
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
M5 wrote: Thanks Jim. No problem. The processing is pretty quick. I don't think that's a bottleneck. It basically just inserts the data into MySQL, not much processing actually. What is the likelihood that two connections would come in at the same time, or at least within close enough time that the second would have to wait for the first to finish its job? -- Jim Lucas "Perseverance is not a long race; it is many short races one after the other" - Walter Elliot "Some men are born to greatness, some achieve greatness, and some have greatness thrust upon them." - Twelfth Night, Act II, Scene V, by William Shakespeare
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
stream_socket_server simply listens, stream_socket_accept handles the connection, stream_set_write_buffer and stream_set_blocking help you keep up, especially when combined with stream_get_line; no need to loop forever when you can just: while (is_resource($conn = stream_socket_accept($socket, -1))) while (is_resource($conn) && $pkt = stream_get_line($conn, 100, "\n")) Key I find though is multithreading: a listener thread with stream_socket_server, 2 or 3 stream_socket_accept threads, and a pair of new threads spawned to handle each connection (one to read, one to write) (not needed for stateless http style request processing). Nathan M5 wrote: Curiously, would you agree with this guy's comments concerning low-level PHP socket functions vs stream_socket_server() ? If you want a high speed socket server, use the low-level sockets instead (socket_create/bind/listen). The stream_socket_server version appears to have internal fixed 8k buffers that will overflow if you don't keep up by reading. This is a serious problem if you have an application that reads the socket for messages and then, say, saves the result in a database. The delay while it is busy processing means you can't read the data in time unless you get involved in multi-threading. With the low-level functions, the OS quietly buffers TCP/IP packets so there is no problem (tested on Windows XP Professional). (http://www.php.net/manual/en/function.stream-socket-server.php#67837) On 10-Dec-07, at 9:46 PM, Tom Rogers wrote: Hi, Tuesday, December 11, 2007, 10:01:38 AM, you wrote: RF On 10-Dec-07, at 4:42 PM, Tom Rogers wrote: Put a usleep(1000) in the listen while() loop and give the cpu a break. RF Good advice, but I've already been doing that. The thing is, when the RF script first starts up, the CPU rarely exceeds 30%, even when many RF clients (200+) are simultaneously connected and sending data. When a RF few clients are connected, CPU is typically below 10%.
Again, it's RF only after 24-48 hours that, all of a sudden, CPU usage increases by RF 40-50%. And it stays high until I stop the script and restart it. RF One question I have though is, is there actually any benefit to using RF mysql_pconnect(), since the script simply loops? My understanding is RF that pconnect only benefits if a script would otherwise be using RF mysql_connect repeatedly--and this script doesn't, since it calls RF mysql_[p]connect() just once, at the start of execution. RF ...Rene I have found pconnect to be a problem (several years ago) and have never tried it since, it may well be ok now. The most likely cause is memory consumption on long running php scripts, what does top say? I have a script which runs from cron and was hammering the system when it ran and i have had to put the usleep() in the while($result = ..) loop as there are a few thousand rows. Probably bad design but it works and I'm loath to touch it :) One way to solve the memory issue is to have the script started by inetd, slower but more memory friendly. Also have a look at memcached to reduce the load a bit. -- regards, Tom
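Nathan's accept/read one-liner above has lost a couple of characters in transit; a runnable single-process version of the same loop might look like the sketch below. The port and the 100-byte line limit are arbitrary choices, not values from the thread:

```php
<?php
// Single-process sketch of Nathan's accept/read loop.
// Address and the 100-byte line limit are arbitrary.
$socket = stream_socket_server('tcp://0.0.0.0:9000', $errno, $errstr);
if (!$socket) {
    die("$errstr ($errno)\n");
}

// Block until a client connects (-1 = wait forever) ...
while (is_resource($conn = stream_socket_accept($socket, -1))) {
    stream_set_blocking($conn, true);
    // ... then read newline-terminated packets until the client hangs up.
    while (is_resource($conn)
            && ($pkt = stream_get_line($conn, 100, "\n")) !== false) {
        // handle $pkt here
    }
    fclose($conn);
}
```

Note this handles one client at a time; the multithreading Nathan describes is what lets a slow handler avoid stalling the accept loop.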
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
hi Nathan, any chance of a 'full blown' example for all the muppets who want to try and grok this stuff? (bork bork, say I :-)) Nathan Rixham wrote: stream_socket_server simply listens, stream_socket_accept handles the connection, stream_set_write_buffer and stream_set_blocking help you keep up, especially when combined with stream_get_line; no need to loop forever when you can just: while (is_resource($conn = stream_socket_accept($socket, -1))) while (is_resource($conn) && $pkt = stream_get_line($conn, 100, "\n")) Key I find though is multithreading: a listener thread with stream_socket_server, 2 or 3 stream_socket_accept threads, and a pair of new threads spawned to handle each connection (one to read, one to write) (not needed for stateless http style request processing). Nathan M5 wrote: Curiously, would you agree with this guy's comments concerning low-level PHP socket functions vs stream_socket_server() ? If you want a high speed socket server, use the low-level sockets instead (socket_create/bind/listen). The stream_socket_server version appears to have internal fixed 8k buffers that will overflow if you don't keep up by reading. This is a serious problem if you have an application that reads the socket for messages and then, say, saves the result in a database. The delay while it is busy processing means you can't read the data in time unless you get involved in multi-threading. With the low-level functions, the OS quietly buffers TCP/IP packets so there is no problem (tested on Windows XP Professional). (http://www.php.net/manual/en/function.stream-socket-server.php#67837) On 10-Dec-07, at 9:46 PM, Tom Rogers wrote: Hi, Tuesday, December 11, 2007, 10:01:38 AM, you wrote: RF On 10-Dec-07, at 4:42 PM, Tom Rogers wrote: Put a usleep(1000) in the listen while() loop and give the cpu a break. RF Good advice, but I've already been doing that.
The thing is, when the RF script first starts up, the CPU rarely exceeds 30%, even when many RF clients (200+) are simultaneously connected and sending data. When a RF few clients are connected, CPU is typically below 10%. Again, it's RF only after 24-48 hours that, all of a sudden, CPU usage increases by RF 40-50%. And it stays high until I stop the script and restart it. RF One question I have though is, is there actually any benefit to using RF mysql_pconnect(), since the script simply loops? My understanding is RF that pconnect only benefits if a script would otherwise be using RF mysql_connect repeatedly--and this script doesn't, since it calls RF mysql_[p]connect() just once, at the start of execution. RF ...Rene I have found pconnect to be a problem (several years ago) and have never tried it since, it may well be ok now. The most likely cause is memory consumption on long running php scripts, what does top say? I have a script which runs from cron and was hammering the system when it ran and i have had to put the usleep() in the while($result = ..) loop as there are a few thousand rows. Probably bad design but it works and I'm loath to touch it :) One way to solve the memory issue is to have the script started by inetd, slower but more memory friendly. Also have a look at memcached to reduce the load a bit. -- regards, Tom
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
Jochem Maas wrote: Nathan Rixham wrote: Key I find though is multithreading: a listener thread with stream_socket_server, 2 or 3 stream_socket_accept threads, and a pair of new threads spawned to handle each connection (one to read, one to write) (not needed for stateless http style request processing). Nathan hi Nathan, any chance of a 'full blown' example for all the muppets who want to try and grok this stuff? (bork bork, say I :-)) I'd be interested to see how he does the multi-threading in php. Personally I'd always opt for C to write this type of thing, except for perhaps the most simple cases. /Per Jessen, Zürich
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
Per Jessen wrote: Jochem Maas wrote: Nathan Rixham wrote: Key I find though is multithreading: a listener thread with stream_socket_server, 2 or 3 stream_socket_accept threads, and a pair of new threads spawned to handle each connection (one to read, one to write) (not needed for stateless http style request processing). Nathan hi Nathan, any chance of a 'full blown' example for all the muppets who want to try and grok this stuff? (bork bork, say I :-)) I'd be interested to see how he does the multi-threading in php. Personally I'd always opt for C to write this type of thing, except for perhaps the most simple cases. any chance of an example from you too? /Per Jessen, Zürich
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
Jochem Maas wrote: I'd be interested to see how he does the multi-threading in php. Personally I'd always opt for C to write this type of thing, except for perhaps the most simple cases. any chance of an example from you too? Sure - http://jessen.ch/files/distripg_main.c It can't be compiled, but the pseudo-code goes like this:

The main thread:

    initialize
    bind() to address(es) to listen to
    start a number of threads (=workers)
    do until terminated
        poll() for new work
        if new_work(), accept(), queue it, then wake up the workers
    done

A worker thread:

    initialize
    do until terminated
        wait for work
        accept()
        process work
    done

It wouldn't be too difficult to have threads dynamically started and stopped depending on the amount of work queued up. /Per Jessen, Zürich
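PHP has no native threads, so a literal translation of Per's pseudo-code isn't possible; the closest idiom is a pre-forked worker pool via the pcntl extension, where each worker blocks in accept() itself rather than being fed from a queue. A sketch under those assumptions (worker count and address are arbitrary):

```php
<?php
// Pre-forked worker pool: a PHP analogue of Per's worker threads.
// Requires the pcntl extension (CLI only).
$socket = stream_socket_server('tcp://0.0.0.0:9000', $errno, $errstr)
    or die("$errstr ($errno)\n");

for ($i = 0; $i < 3; $i++) {            // start a number of workers
    if (pcntl_fork() === 0) {           // child: becomes a worker
        // each worker blocks in accept() on the shared listening socket
        while ($conn = @stream_socket_accept($socket, -1)) {
            // process work here, then go back to accepting
            fclose($conn);
        }
        exit(0);
    }
}
while (pcntl_wait($status) > 0) {       // parent: reap workers
    ;
}
```

The kernel hands each incoming connection to exactly one of the blocked workers, which stands in for the queue-and-wake step in Per's C version.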
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
That makes sense, but I'm not sure I really want to do this, since it's fairly important that Listener continue listening without interruption. I also don't think it's probably necessary, since from what I read, I'm not really pushing the envelope in terms of real load. Right now, I might have max ~250 clients connected, each sending 5-20 kb / day of data. It's not much data, nor many concurrent connections. If Jim's Listener handles 80-85k connections per day, then mine should be able to do 250 concurrently easily, and 2000 cumulative per day without a hitch. Did I mention, I'm on Mac OS X Server 10.4.11? Shouldn't matter, but anyway. On 10-Dec-07, at 5:48 PM, Jochem Maas wrote: Jim Lucas wrote: Tom Rogers wrote: Hi, ... Also, make sure you are not using an array that you are not re-initializing through each iteration of the loop. If the array keeps getting bigger, PHP might $*% on itself. Always re-initialize arrays to clean them up. even then he may still have creeping memory ... in which case it might be possible to have a mother process that spawns and watches a child process .. the child process is the actual daemon, the child could then keep track of its own memory usage and then kill itself when it gets too big ... the mother in turn would automatically spawn a new child daemon process upon seeing its child has committed suicide. does that make sense?
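Jochem's mother/child idea can be sketched with pcntl_fork(): the child is the real daemon and exits once its own memory crosses a limit, and the mother loops and respawns it. The 32 MB limit and do_one_unit_of_work() are hypothetical placeholders, not anything from the thread:

```php
<?php
// Mother/child watchdog: the child kills itself on memory creep,
// the mother respawns it. Requires the pcntl extension (CLI only).
define('MEM_LIMIT', 32 * 1048576);          // arbitrary example limit

while (true) {
    $pid = pcntl_fork();
    if ($pid === 0) {                       // child: the actual daemon
        while (true) {
            do_one_unit_of_work();          // hypothetical work function
            if (memory_get_usage(true) > MEM_LIMIT) {
                exit(0);                    // suicide; mother respawns us
            }
        }
    }
    pcntl_waitpid($pid, $status);           // mother: wait for child to die
}
```

The gap René worries about is only the fork-and-restart time; a listening socket opened by the mother and inherited by the child would shrink it further.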
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
René Fournier wrote: However, the number of socket clients connecting in the past 3-4 months has steadily increased, and this seems to have exposed (if not created) a strange performance issue with PHP 5.2.4, MySQL 5.0.45 and/or Mac OS X Server 10.4.11. (I say and/or because I am unsure where the problem's cause really lies.) Basically, after the script has been running for a day or so (processing essentially the amount of data that used to take two weeks), the CPU usage of the machine goes from 30% (normal) to 80-90%. Have you tried stracing it to see what's really happening when the load goes that high? /Per Jessen, Zürich
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
Per Jessen wrote: René Fournier wrote: However, the number of socket clients connecting in the past 3-4 months has steadily increased, and this seems to have exposed (if not created) a strange performance issue with PHP 5.2.4, MySQL 5.0.45 and/or Mac OS X Server 10.4.11. (I say and/or because I am unsure where the problem's cause really lies.) Basically, after the script has been running for a day or so (processing essentially the amount of data that used to take two weeks), the CPU usage of the machine goes from 30% (normal) to 80-90%. Have you tried stracing it to see what's really happening when the load goes that high? am I correct that that would be done like so?: strace -p <pid of php daemon> I'm a little new to this kind of coolness, I found this page which is very helpful: http://www.ibm.com/developerworks/aix/library/au-unix-strace.html I was going to suggest using gdb to figure out what the process is doing but figured it wouldn't be handy to have the daemon running via gdb for 48 hours ... silly me didn't grok that you can attach to an existing process (something that ibm article made clear to me) can't wait till my macbook arrives so that I can start to play with this kind of stuff locally ... aka another excuse to justify the aluminium beast ;-) /Per Jessen, Zürich
[PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
Hello, I have a command-line PHP script--called Listener--that is designed to run indefinitely with a predictable CPU usage and memory footprint. In a nutshell, it's a multi-client socket server that waits for incoming connections, processes incoming data, stores results in a MySQL database, and basically gets on with its life. And it works. No errors or serious problems to speak of. And I've been running it for a couple years on an Xserve dual-G5 2GHz (w/ OS X Server 10.4.11). Six months ago, the program would run for days, even a couple weeks, without a hitch. The only reason I would stop the script is for some other purpose, like a software update. However, the number of socket clients connecting in the past 3-4 months has steadily increased, and this seems to have exposed (if not created) a strange performance issue with PHP 5.2.4, MySQL 5.0.45 and/or Mac OS X Server 10.4.11. (I say and/or because I am unsure where the problem's cause really lies.) Basically, after the script has been running for a day or so (processing essentially the amount of data that used to take two weeks), the CPU usage of the machine goes from 30% (normal) to 80-90%. This appears to be irrespective of the number of clients connected to the server at the time, but rather the amount of time the script has been running (and therefore cumulative cycles it's operated, data processed, MySQL queries executed, etc.). And the CPU usage stays high, even when the actual load (number of clients) decreases. At this time, if I run top, I get the following info:

    22512  mysqld    91.6%  8:22:12   31   106  125  305M+  3.20M  260M    475M
    17104  php       7.2%   81:14.01  115  145       5.08M  7.80M  10.9M-  87.5M
    22537  Terminal  6.6%   2:59:59   22   176  240  12.3M  21.2M  18.3M-  236M
    [...]

At first I thought, okay, it's MySQL's fault. Maybe a lot of slow-running queries. But the slow query log is pretty clean. So maybe it's a combination of Mac OS X and MySQL and PHP?
I Googled for a similar problem, and finally ran across this article: http://www.shawnhogan.com/2005/10/mysql-problems-on-mac-os-x-server.html ...where the author describes a very similar CPU usage pattern. I tried his suggested fixes and they seemed to have helped a little (or maybe it's my wishful thinking--hard to tell), since the high CPU load issue doesn't appear to happen as soon... But it still happens. Anyway, I'm really stumped as to what to do next, where to look, etc. If I stop the script and restart it (but not MySQL itself), CPU usage goes back to normal--for about a day or two. The only thing I thought might be connected is how many short-lived PHP child processes Listener creates--around 20-30,000 per day. Sounds high, but on average it's just one every 2-3 seconds. Anyway, although the number of child processes isn't concurrent, would there be a problem with the number of historical child processes in view of ulimits or kern.maxfilesperproc? Any suggestions, tips, or links are much appreciated. Thanks. ...Rene
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
René Fournier wrote: Hello, I have a command-line PHP script--called Listener--that is designed to run indefinitely with a predictable CPU usage and memory footprint. In a nutshell, it's a multi-client socket server that waits for incoming connections, processes incoming data, stores results in a MySQL database, and basically gets on with its life. And it works. No errors or serious problems to speak of. And I've been running it for a couple years on an Xserve dual-G5 2GHz (w/ OS X Server 10.4.11). Six months ago, the program would run for days, even a couple weeks, without a hitch. The only reason I would stop the script is for some other purpose, like a software update. However, the number of socket clients connecting in the past 3-4 months has steadily increased, and this seems to have exposed (if not created) a strange performance issue with PHP 5.2.4, MySQL 5.0.45 and/or Mac OS X Server 10.4.11. (I say and/or because I am unsure where the problem's cause really lies.) Basically, after the script has been running for a day or so (processing essentially the amount of data that used to take two weeks), the CPU usage of the machine goes from 30% (normal) to 80-90%. This appears to be irrespective of the number of clients connected to the server at the time, but rather the amount of time the script has been running (and therefore cumulative cycles it's operated, data processed, MySQL queries executed, etc.). And the CPU usage stays high, even when the actual load (number of clients) decreases. At this time, if I run top, I get the following info: 22512 mysqld 91.6% 8:22:12 31 106 125 305M+ 3.20M 260M 475M 17104 php 7.2% 81:14.01 115 145 5.08M 7.80M 10.9M- 87.5M 22537 Terminal 6.6% 2:59:59 22 176 240 12.3M 21.2M 18.3M- 236M [...] At first I thought, okay, it's MySQL's fault. Maybe a lot of slow-running queries. But the slow query log is pretty clean. So maybe it's a combination of Mac OS X and MySQL and PHP?
I Googled for a similar problem, and finally ran across this article: http://www.shawnhogan.com/2005/10/mysql-problems-on-mac-os-x-server.html ...where the author describes a very similar CPU usage pattern. I tried his suggested fixes and they seemed to have helped a little (or maybe it's my wishful thinking--hard to tell), since the high CPU load issue doesn't appear to happen as soon... But it still happens. Anyway, I'm really stumped as to what to do next, where to look, etc. If I stop the script and restart it (but not MySQL itself), CPU usage goes back to normal--for about a day or two. The only thing I thought might be connected is how many short-lived PHP child processes Listener creates--around 20-30,000 per day. Sounds high, but on average it's just one every 2-3 seconds. Anyway, although the number of child processes isn't concurrent, would there be a problem with the number of historical child processes in view of ulimits or kern.maxfilesperproc? Any suggestions, tips, or links are much appreciated. Thanks. ...Rene I have a server that listens like yours does. I get 80k - 85k connections a day to it. When I first started it, I was only getting about 3k connections a day. Then I upped the listening pattern and it tanked. I noticed that all my mail/web/db connections just sat there. When I investigated, I found that the number of connections to the server was being overloaded. So I increased the kern.maxfilesperproc setting to 32000. All the problems went away. I have about half the horse power you do, running OpenBSD 4.1, and it runs great now as my listener / web / ftp / mail / named / database / spam filter / etc... One question about the listener program: does it maintain a connection to the DB or does it open/close a connection upon each socket connection? If it does the latter, you might look into using a persistent connection rather than opening/closing on a per connection basis.
-- Jim Lucas Some men are born to greatness, some achieve greatness, and some have greatness thrust upon them. Twelfth Night, Act II, Scene V by William Shakespeare
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
Hi Jim, I have a server that listens like yours does. I get 80k - 85k connections a day to it. When I first started it, I was only getting about 3k connections a day. Then I upped the listening pattern and it tanked. I noticed that all my mail/web/db connections just sat there. When I investigated, I found that the number of connections to the server was being overloaded. So I increased the kern.maxfilesperproc setting to 32000. All the problems went away. I have about half the horse power you do, running OpenBSD 4.1, and it runs great now as my listener / web / ftp / mail / named / database / spam filter / etc... One question about the listener program: does it maintain a connection to the DB or does it open/close a connection upon each socket connection? If it does the latter, you might look into using a persistent connection rather than opening/closing on a per connection basis. Thanks for the reply. (Not many people seem to be doing what I'm doing in the way I'm doing it... so I really appreciate the feedback.) I don't think the kern.maxfilesperproc setting is a problem. It's currently set to 102400 per Shawn Hogan's advice. And my Listener program only receives ~1000 connections per day. However, each connection involves multiple MySQL queries, often as many as 50 or so per connection each day. So perhaps your second observation applies. MySQL is showing ~1,500 connections per hour. And I'm using mysql_connect() in Listener. I will see if using mysql_pconnect() and reducing the number of connections helps. My.cnf's max_connections, presently at 100, may have been too low in view of using mysql_connect(). If mysql_pconnect() doesn't improve things, maybe I should bump up max_connections to 500? ...Rene
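Since Listener connects once at startup, mysql_pconnect() by itself shouldn't change much; what matters more is that each of the short-lived child processes reuses a single link for all of its queries instead of reconnecting per query. A sketch of that pattern (the credentials are placeholders):

```php
<?php
// One connection per process, reused for every query.
// mysql_pconnect() only pays off when mysql_connect() would otherwise
// be called repeatedly across separate requests.
function db()
{
    static $link = null;
    if ($link === null) {
        // placeholder credentials
        $link = mysql_connect('localhost', 'user', 'password');
        if (!$link) {
            die('connect failed: ' . mysql_error());
        }
    }
    return $link;
}

$a = mysql_query('SELECT 1', db());   // first call opens the link
$b = mysql_query('SELECT 2', db());   // later calls reuse it
```

With ~50 queries per connection, this cuts the MySQL connection rate from ~50 per socket client down to one.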
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
Hi, Tuesday, December 11, 2007, 6:42:18 AM, you wrote: RF Hello, RF I have a command-line PHP script--called Listener--that is designed RF to run indefinitely with a predictable CPU usage and memory RF footprint. In a nutshell, it's a multi-client socket server that RF waits for incoming connections, processes incoming data, stores RF results in a MySQL database, and basically gets on with its life. And RF it works. No errors or serious problems to speak of. And I've been RF running it for a couple years on an Xserve dual-G5 2GHz w/ OS X RF Server 10.4.11). Six months ago, the program would run for days, even RF a couple weeks, without a hitch. The only reason I would stop the RF script is for some other purpose, like a software update. RF However, the number of socket clients connecting in the past 3-4 RF months has steadily increased, and this seems to have exposed (if not RF created) a strange performance issue with PHP 5.2.4, MySQL 5.0.45 RF and/or Mac OS X Server 10.4.11. (I say and/or because I am unsure RF where the problem's cause really lies.) Basically, after the script RF has been running for a day or so (processing essentially the amount RF data that used to take two weeks), the CPU usage of the machine goes RF from 30% (normal) to 80-90%. This appears to be irrespective of the RF number of clients connected to the server at the time, but rather the RF amount of time the script has been running (and therefore cumulative RF cycles it's operated, data processed, MySQL queries executed, etc.). RF And the CPU usage stays high, even when the actual load (number of RF clients) decreases. At this time, if I run top, I get the following RF info: RF 22512 mysqld 91.6% 8:22:12 31 106 125 305M+ 3.20M RF 260M 475M RF 17104 php 7.2% 81:14.01 115 145 5.08M 7.80M RF 10.9M- 87.5M RF 22537 Terminal 6.6% 2:59:59 22 176 240 12.3M 21.2M RF 18.3M- 236M RF [...] RF At first I thought, okay, it's MySQL's fault. Maybe a lot of slow- RF running queries. 
RF But the slow query log is pretty clean. So maybe it's a combination of Mac OS X and MySQL and PHP? I Googled for a similar problem, and finally ran across this article:

RF http://www.shawnhogan.com/2005/10/mysql-problems-on-mac-os-x-server.html

RF ...where the author describes a very similar CPU usage pattern. I tried his suggested fixes and they seemed to have helped a little (or maybe it's my wishful thinking--hard to tell), since the high CPU load issue doesn't appear to happen as soon... But it still happens.

RF Anyway, I'm really stumped as to what to do next, where to look, etc. If I stop the script and restart it (but not MySQL itself), CPU usage goes back to normal--for about a day or two.

RF The only thing I thought might be connected is how many short-lived PHP child processes Listener creates--around 20-30,000 per day. Sounds high, but on average it's just one every 2-3 seconds. Although the child processes aren't concurrent, could there be a problem with the number of historical child processes in view of ulimits or kern.maxfilesperproc?

RF Any suggestions, tips, or links are much appreciated. Thanks.

RF ...Rene

Put a usleep(1000) in the listen while() loop and give the cpu a break.

-- 
regards,
Tom

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
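To make Tom's suggestion concrete, here is a minimal sketch of where the usleep() would sit in a typical listen loop. The server setup and port are illustrative, not René's actual code:

```php
<?php
// Sketch of Tom's suggestion: yield the CPU on every pass through the
// listen loop instead of spinning.
$socket = stream_socket_server('tcp://127.0.0.1:9876', $errno, $errstr);
while (true) {
    $read = array($socket);
    $w = NULL; $e = NULL;
    // Block up to 1 second waiting for activity instead of busy-polling.
    if (stream_select($read, $w, $e, 1) > 0) {
        // ... accept the connection, read and process data ...
    }
    usleep(1000); // 1 ms pause so an otherwise-tight loop doesn't peg the CPU
}
```

The stream_select() timeout already does most of the work here; the usleep() is just extra insurance when the loop body itself runs hot.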
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
On 10-Dec-07, at 4:42 PM, Tom Rogers wrote:

Put a usleep(1000) in the listen while() loop and give the cpu a break.

Good advice, but I've already been doing that. The thing is, when the script first starts up, the CPU rarely exceeds 30%, even when many clients (200+) are simultaneously connected and sending data. When a few clients are connected, CPU is typically below 10%. Again, it's only after 24-48 hours that, all of a sudden, CPU usage increases by 40-50%. And it stays high until I stop the script and restart it.

One question I have though is, is there actually any benefit to using mysql_pconnect(), since the script simply loops? My understanding is that pconnect only benefits if a script would otherwise be calling mysql_connect() repeatedly--and this script doesn't, since it calls mysql_[p]connect() just once, at the start of execution.

...Rene
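René's reasoning in code form: for a looping daemon there is nothing for mysql_pconnect() to reuse, so a single mysql_connect() at startup is enough (mysql_* is the API of the PHP 5.2 era this thread is about; host, credentials, and table are placeholders):

```php
<?php
// Connect once at startup; the same handle serves every iteration.
$db = mysql_connect('localhost', 'user', 'pass');   // placeholders
mysql_select_db('listener', $db);                   // hypothetical db name

while (true) {
    // ... read socket data into $sock_data ...
    // The one $db handle is reused for every query; no reconnect per loop,
    // so pconnect's connection caching buys nothing here.
    // mysql_query("INSERT INTO readings ...", $db);
    usleep(1000);
}
// mysql_close($db);  // unreachable in practice; the daemon runs forever
```

pconnect only pays off when many short-lived scripts (e.g. web requests) would each otherwise open a fresh connection.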
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
Tom Rogers wrote:
Hi, Tuesday, December 11, 2007, 6:42:18 AM, you wrote: RF Hello,
Put a usleep(1000) in the listen while() loop and give the cpu a break.

This makes me think about asking if you have too short of a timeout on your receiving connection? What are you using to set up your connection? fsockopen() or stream_socket_server()? Here is a snippet of what I have:

<?php
$conn = mysql_connect(...);
if ( $socket = @stream_socket_server('udp://'.LISTEN_IP.':'.LISTEN_PORT, $errno, $errstr, STREAM_SERVER_BIND) ) {
    while ( true ) {
        /* Get the exact same packet again, but remove it from the buffer this time. */
        $buff = stream_socket_recvfrom($socket, 1024, 0, $remote_ip);
        # do stuff with your incoming data and mysql connection
    }
    fclose($socket);
}
mysql_close($conn);
?>

I don't have a timeout set on the *_recvfrom() call. I just wait until the next connection comes in.

You don't need to use mysql_pconnect(), especially if you are using what is, in essence, a daemon. Just don't open and close the connection constantly; leave it open.

Also, make sure you are not using an array that you are not re-initializing through each iteration of the loop. If the array keeps getting bigger, PHP might $*% on itself. Always re-initialize arrays to clean them up.

Hope some of this helps!

-- 
Jim Lucas

Some men are born to greatness, some achieve greatness, and some have greatness thrust upon them.

Twelfth Night, Act II, Scene V by William Shakespeare
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
Jim Lucas wrote:
Tom Rogers wrote: Hi, ...
Also, make sure you are not using an array that you are not re-initializing through each iteration of the loop. If the array keeps getting bigger, PHP might $*% on itself. Always re-initialize arrays to clean them up.

Even then he may still have creeping memory ... in which case it might be possible to have a mother process that spawns and watches a child process. The child process is the actual daemon; the child could then keep track of its own memory usage and kill itself when it gets too big. The mother in turn would automatically spawn a new child daemon process upon seeing its child has committed suicide. Does that make sense?
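Jochem's mother/child idea can be sketched with the pcntl extension (assuming it is compiled in, which the thread doesn't confirm; the memory limit is an arbitrary illustration):

```php
<?php
// Mother/child watchdog sketch. The child runs the daemon loop and exits
// when its own memory use crosses a threshold; the mother just forks a
// replacement. Requires the pcntl extension (CLI builds usually have it).
define('MEM_LIMIT', 64 * 1024 * 1024); // 64 MB -- illustrative threshold

while (true) {                  // mother: respawn forever
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    } elseif ($pid === 0) {     // child: the actual daemon
        while (true) {
            // ... accept connections, process data, query MySQL ...
            if (memory_get_usage() > MEM_LIMIT) {
                exit(0);        // "commit suicide"; mother respawns us
            }
            usleep(1000);
        }
    }
    pcntl_waitpid($pid, $status); // mother blocks until the child dies
}
```

One design note: the child should open its own MySQL and listening sockets after the fork, since handles inherited across fork() are easy to corrupt when two processes share them.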
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
On 10-Dec-07, at 5:20 PM, Jim Lucas wrote:
Tom Rogers wrote: Hi, Tuesday, December 11, 2007, 6:42:18 AM, you wrote: RF Hello,
Put a usleep(1000) in the listen while() loop and give the cpu a break.

This makes me think about asking if you have too short of a timeout on your receiving connection?

One second on stream_socket_server(), with a 900 second timeout on stream_socket_accept().

What are you using to set up your connection? fsockopen() or stream_socket_server()? Here is a snippet of what I have [...]

Thanks for sharing your code. Seems pretty similar to mine at first glance.

I don't have a timeout set on the *_recvfrom() call. I just wait until the next connection comes in. You don't need to use mysql_pconnect(), especially if you are using what is, in essence, a daemon.

Yeah, after thinking about it, that's what I figured. Thanks for confirming though.

Just don't open and close the connection constantly; leave it open.

Yes, I just open it once at the top of the script. And that's it.

Also, make sure you are not using an array that you are not re-initializing through each iteration of the loop. If the array keeps getting bigger, PHP might $*% on itself. Always re-initialize arrays to clean them up. Hope some of this helps!

All good advice. I will check my arrays, although I don't think this is a problem, since I monitor the script's memory usage with memory_get_usage() and memory_get_peak_usage(), and it never tops 3MB (max allocated is 16MB). There are a few little parts to the daemon. One thing I'm doing that could be problematic is running an include() on a couple of files each time a socket has new data (this allows me to adjust the processing logic on the fly without having to restart the script and wait for clients to reconnect)--but I can see this being expensive in terms of performance and resources. Actually, I wonder if THAT is what's starving the script of resources over time--each fread() involves several include()s. I'll have to look into that...
FWIW, here's the stripped-down skeleton of the server. As always, constructive criticism is very welcome.

<?php
$socket = stream_socket_server('tcp://127.0.0.1:9876', $errno, $errstr);
if ($socket) {
    $master[] = $socket;
    $read = $master;
    $write = $master;
    while (1) {
        $read = $master;
        $write = $master;
        $mod_fd = stream_select($read, $_w = NULL, $_e = NULL, 1);
        if ($mod_fd === FALSE) {
            break;
        }
        for ($i = 0; $i < $mod_fd; ++$i) {
            if ($read[$i] === $socket) { // NEW SOCKET
                $conn = stream_socket_accept($socket, 900);
                $master[] = $conn;
                $key_num = array_search($conn, $master, TRUE);
            } else {
                $sock_data = fread($read[$i], 32768);
                if (strlen($sock_data) === 0) { // CONNECTION GONE
                    $key_to_del = array_search($read[$i], $master, TRUE);
                    fclose($read[$i]);
                    unset($master[$key_to_del]);
                } elseif ($sock_data === FALSE) { // CONNECTION BROKEN
                    $key_to_del = array_search($read[$i], $master, TRUE);
                    fclose($read[$i]);
                    unset($master[$key_to_del]);
                } else { // READ INCOMING DATA
                    // include (somefiles);
                    // include (somefiles);
                    // include (somefiles);
                    // [ ... ]
                }
            }
        }
    }
}
?>
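On the include() churn René suspects above, the refactor he is considering can be sketched in a few lines. File and function names here are made up for illustration:

```php
<?php
// Sketch: load the processing logic once as functions instead of
// re-including files on every fread(). Names are hypothetical.
require_once 'processing.php';   // defines process_packet(), store_result()

// ... then, inside the read loop, the per-read include()s become calls:
// $result = process_packet($sock_data);
// store_result($result, $db);
```

The trade-off is exactly the one he values: with require_once, edits to processing.php no longer take effect on the fly; a redeployed daemon has to be restarted (or respawned by a watchdog) to pick them up, because PHP cannot re-declare functions from a re-included file.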
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
Jochem Maas wrote:
Jim Lucas wrote:
Tom Rogers wrote: Hi, ...
Also, make sure you are not using an array that you are not re-initializing through each iteration of the loop. If the array keeps getting bigger, PHP might $*% on itself. Always re-initialize arrays to clean them up.

Even then he may still have creeping memory ... in which case it might be possible to have a mother process that spawns and watches a child process. The child process is the actual daemon; the child could then keep track of its own memory usage and kill itself when it gets too big. The mother in turn would automatically spawn a new child daemon process upon seeing its child has committed suicide. Does that make sense?

it may ... but it's beside the point ... apparently I'm having difficulty even reading the subject of [EMAIL PROTECTED] :-P
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
René Fournier wrote:
FWIW, here's the stripped-down skeleton of the server. As always, constructive criticism is very welcome.

<?php
$socket = stream_socket_server('tcp://127.0.0.1:9876', $errno, $errstr);
if ($socket) {
    $master[] = $socket;
    $read = $master;
    $write = $master;
    while (1) {
        $read = $master;
        $write = $master;

The following part, I think, is where your problem is. This line tells the system to wait up to 1 second and then continue, whether you have an inbound connection or not:

        $mod_fd = stream_select($read, $_w = NULL, $_e = NULL, 1);

Then here you are testing for success or failure of the last call:

        if ($mod_fd === FALSE) {
            break;
        }

Problem: if you don't have a connection, then this will fail constantly--once every second, times the number of connections you have...

        for ($i = 0; $i < $mod_fd; ++$i) {
            if ($read[$i] === $socket) { // NEW SOCKET
                $conn = stream_socket_accept($socket, 900);
                $master[] = $conn;
                $key_num = array_search($conn, $master, TRUE);
            } else {
                $sock_data = fread($read[$i], 32768);
                if (strlen($sock_data) === 0) { // CONNECTION GONE
                    $key_to_del = array_search($read[$i], $master, TRUE);
                    fclose($read[$i]);
                    unset($master[$key_to_del]);
                } elseif ($sock_data === FALSE) { // CONNECTION BROKEN
                    $key_to_del = array_search($read[$i], $master, TRUE);
                    fclose($read[$i]);
                    unset($master[$key_to_del]);
                } else { // READ INCOMING DATA

Here, you are not removing the successful connections from $master, so it keeps growing on and on...

As for the includes, well, I would turn them into function calls and then have a different function for the different ways you want the program to react.

                    // include (somefiles);
                    // include (somefiles);
                    // include (somefiles);
                    // [ ... ]
                }
            }
        }
    }
}
?>

I didn't see anything about your DB connections. Are those located in the includes? How long, on average, does your processing of the incoming data take? Because, to me, it looks like you might have a blocking problem with the in-coming connections.
If the initial connection takes too long, then the following connections might be getting blocked. You might want to look into the pcntl_* functions to do forking if you need it.
Re[2]: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
Hi,

Tuesday, December 11, 2007, 10:01:38 AM, you wrote:

RF Good advice, but I've already been doing that. The thing is, when the script first starts up, the CPU rarely exceeds 30%, even when many clients (200+) are simultaneously connected and sending data. When a few clients are connected, CPU is typically below 10%. Again, it's only after 24-48 hours that, all of a sudden, CPU usage increases by 40-50%. And it stays high until I stop the script and restart it.

RF One question I have though is, is there actually any benefit to using mysql_pconnect(), since the script simply loops? My understanding is that pconnect only benefits if a script would otherwise be using mysql_connect repeatedly--and this script doesn't, since it calls mysql_[p]connect() just once, at the start of execution.

RF ...Rene

I have found pconnect to be a problem (several years ago) and have never tried it since; it may well be ok now. The most likely cause is memory consumption on long-running php scripts--what does top say? I have a script which runs from cron and was hammering the system when it ran, and I have had to put the usleep() in the while($result = ..) loop, as there are a few thousand rows. Probably bad design, but it works and I'm loath to touch it :)

One way to solve the memory issue is to have the script started by inetd--slower, but more memory friendly. Also have a look at memcached to reduce the load a bit.

-- 
regards,
Tom
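One way to answer Tom's "what does top say?" from inside the script itself is to log the interpreter's own view of memory periodically, so a slow leak shows up in the log long before the CPU spike. The interval and log path below are illustrative:

```php
<?php
// Periodic in-process memory logging for a long-running daemon.
$last_log = 0;
while (true) {
    // ... daemon work: select, accept, read, process ...

    if (time() - $last_log >= 300) {               // every 5 minutes
        error_log(sprintf("[%s] mem=%d peak=%d\n",
            date('c'), memory_get_usage(), memory_get_peak_usage()),
            3, '/tmp/listener-mem.log');           // type 3 = append to file
        $last_log = time();
    }
    usleep(1000);
}
```

If the logged numbers stay flat (as René's 3MB observation suggests) while mysqld's CPU climbs, that points the investigation away from a PHP-side leak and toward the database or OS layer.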
Re: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
Thanks Jim. Several good points here that I will look into. I've already moved the include() bits into function calls. (That's a simple thing I should have corrected long ago.) The socket areas, though, I'm less sure about how to adjust, since network programming isn't something I grok naturally.

On 10-Dec-07, at 9:57 PM, Jim Lucas wrote:

René Fournier wrote:
FWIW, here's the stripped-down skeleton of the server. As always, constructive criticism is very welcome.

<?php
$socket = stream_socket_server('tcp://127.0.0.1:9876', $errno, $errstr);
if ($socket) {
    $master[] = $socket;
    $read = $master;
    $write = $master;
    while (1) {
        $read = $master;
        $write = $master;

The following part, I think, is where your problem is. This line tells the system to wait up to 1 second and then continue, whether you have an inbound connection or not:

        $mod_fd = stream_select($read, $_w = NULL, $_e = NULL, 1);

Then here you are testing for success or failure of the last call:

        if ($mod_fd === FALSE) {
            break;
        }

Problem: if you don't have a connection, then this will fail constantly--once every second, times the number of connections you have...

But my understanding from the docs (where I used one of the examples as a template for the script) is that it could/would only fail on startup--that is, if it can't perform stream_select() because it was unable to bind with stream_socket_server()--or am I wrong?:

If the call fails, it will return FALSE and if the optional errno and errstr arguments are present they will be set to indicate the actual system level error that occurred in the system-level socket(), bind(), and listen() calls. If the value returned in errno is 0 and the function returned FALSE, it is an indication that the error occurred before the bind() call. This is most likely due to a problem initializing the socket. Note that the errno and errstr arguments will always be passed by reference.
(http://www.php.net/manual/en/function.stream-socket-server.php)

In other words, $mod_fd can't return FALSE once the socket server has been created and bound to the specified IP... right?

        for ($i = 0; $i < $mod_fd; ++$i) {
            if ($read[$i] === $socket) { // NEW SOCKET
                $conn = stream_socket_accept($socket, 900);
                $master[] = $conn;
                $key_num = array_search($conn, $master, TRUE);
            } else {
                $sock_data = fread($read[$i], 32768);
                if (strlen($sock_data) === 0) { // CONNECTION GONE
                    $key_to_del = array_search($read[$i], $master, TRUE);
                    fclose($read[$i]);
                    unset($master[$key_to_del]);
                } elseif ($sock_data === FALSE) { // CONNECTION BROKEN
                    $key_to_del = array_search($read[$i], $master, TRUE);
                    fclose($read[$i]);
                    unset($master[$key_to_del]);
                } else { // READ INCOMING DATA

Here, you are not removing the successful connections from $master, so it keeps growing on and on...

But only until the connection closes, or no longer blocks (goes away), in which case the program fcloses that socket and removes it from $master[].

As for the includes, well, I would turn them into function calls and then have a different function for the different ways you want the program to react.

Yes, I did that. It helps a bit with CPU.

                    // include (somefiles);
                    // include (somefiles);
                    // include (somefiles);
                    // [ ... ]
                }
            }
        }
    }
}
?>

I didn't see anything about your DB connections. Are those located in the includes?

Just at header.inc, once.

How long, on average, does your processing of the incoming data take? Because, to me, it looks like you might have a blocking problem with the in-coming connections. If the initial connection takes too long, then the following connections might be getting blocked. You might want to look into pcntl_* functions to do forking if you need it.

The processing is pretty quick. I don't think that's a bottleneck. It basically just inserts the data into MySQL, not much processing actually.
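One caution on the FALSE question René raises: the manual text he quotes covers stream_socket_server(), but stream_select() itself can still return FALSE after a successful bind (e.g. when a signal interrupts the call), and breaking on FALSE then silently ends the daemon. A gentler policy is to retry; a minimal sketch, with an illustrative port:

```php
<?php
// Retry-on-FALSE select loop: treat a failed select as transient rather
// than fatal. Setup mirrors the skeleton quoted earlier in the thread.
$socket = stream_socket_server('tcp://127.0.0.1:9876', $errno, $errstr);
$master = array($socket);
while (1) {
    $read = $master;
    $w = NULL; $e = NULL;
    $mod_fd = stream_select($read, $w, $e, 1);
    if ($mod_fd === FALSE) {
        continue;   // interrupted or transient error: retry, don't break
    }
    // ... accept new connections / read ready sockets as before ...
}
```

Whether retrying or logging-then-exiting is right depends on how the daemon is supervised; under a respawning mother process, exiting cleanly is also a defensible choice.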
Re: Re[2]: [PHP] Command-line PHP script CPU usage goes sky-high, stays there--why?
Curiously, would you agree with this guy's comments concerning low-level PHP socket functions vs stream_socket_server()?

If you want a high speed socket server, use the low-level sockets instead (socket_create/bind/listen). The stream_socket_server version appears to have internal fixed 8k buffers that will overflow if you don't keep up by reading. This is a serious problem if you have an application that reads the socket for messages and then, say, saves the result in a database. The delay while it is busy processing means you can't read the data in time unless you get involved in multi-threading. With the low-level functions, the OS quietly buffers TCP/IP packets so there is no problem (tested on Windows XP Professional).

(http://www.php.net/manual/en/function.stream-socket-server.php#67837)

On 10-Dec-07, at 9:46 PM, Tom Rogers wrote:

Hi,

Tuesday, December 11, 2007, 10:01:38 AM, you wrote:

RF Good advice, but I've already been doing that. [...]

RF One question I have though is, is there actually any benefit to using mysql_pconnect(), since the script simply loops? [...]

I have found pconnect to be a problem (several years ago) and have never tried it since; it may well be ok now.
The most likely cause is memory consumption on long-running php scripts--what does top say? I have a script which runs from cron and was hammering the system when it ran, and I have had to put the usleep() in the while($result = ..) loop, as there are a few thousand rows. Probably bad design, but it works and I'm loath to touch it :)

One way to solve the memory issue is to have the script started by inetd--slower, but more memory friendly. Also have a look at memcached to reduce the load a bit.

-- 
regards,
Tom
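For comparison with the quoted manual comment, here is a hedged sketch of the same accept loop written against the low-level socket_* extension (socket_create/bind/listen). It is an untested translation, not a claim that it fixes René's CPU issue; the port and buffer size are illustrative:

```php
<?php
// Low-level socket server equivalent of the stream_socket_server skeleton.
$srv = socket_create(AF_INET, SOCK_STREAM, SOL_TCP);
socket_set_option($srv, SOL_SOCKET, SO_REUSEADDR, 1);
socket_bind($srv, '127.0.0.1', 9876);
socket_listen($srv, 128);            // let the OS queue pending connections

$clients = array();
while (true) {
    $read = array_merge(array($srv), $clients);
    $w = NULL; $e = NULL;
    if (socket_select($read, $w, $e, 1) > 0) {
        foreach ($read as $sock) {
            if ($sock === $srv) {
                $clients[] = socket_accept($srv);      // new client
            } else {
                $data = socket_read($sock, 32768);
                if ($data === '' || $data === false) { // gone or broken
                    $key = array_search($sock, $clients, true);
                    socket_close($sock);
                    unset($clients[$key]);
                } else {
                    // ... process $data, insert into MySQL ...
                }
            }
        }
    }
}
```

The buffering behavior the commenter describes is his own observation on Windows; in either API, keeping the per-read processing short (or handing it to a forked worker) is what actually prevents the kernel buffers from backing up.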