Our servers do not serve any static content; it is entirely application content 
for a mobile application, so we have a great many requests that need to run 
through a PHP server. 

The issue we have is that our scripts depend on external resources, so PHP 
execution time can vary wildly. 
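For what it's worth, part of that variance could probably be bounded on the PHP side itself. A sketch of the relevant php.ini knobs (the directive names are standard PHP; the values are only illustrative guesses, and note that on Unix, time spent waiting on sockets does not count toward max_execution_time):

```ini
; Illustrative values only -- tune to your workload.
max_execution_time = 30      ; cap script execution time (excludes stream waits on Unix)
default_socket_timeout = 10  ; bound waits on external resources opened as PHP streams
```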


Can you elaborate on this comment:
> -do not run too many PHP processes; 2x the number of cores is OK, because if 
> you run too many, context switching will eat your CPU, and RAM is better used 
> for caching, not another 20 PHP processes ;]
Given that we are using mod_php, does this still make sense, or is this only 
relevant to FastCGI? What is 2x the number of cores? Do you mean processor 
cores? We need to be able to concurrently handle as many PHP processes as 
possible. 
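If I'm reading the rule correctly, it would come out to something like the sketch below (assuming "cores" means processor cores as reported by the OS; under mod_php I assume the analogous knob would be Apache prefork's MaxClients rather than a FastCGI pool size):

```shell
# Sketch of the quoted sizing rule; nproc reports processor cores on Linux.
CORES=$(nproc)
PHP_WORKERS=$((CORES * 2))                   # "2x number of cores"
# haproxy per-server limit a bit above the worker count (~1.1x, rounded up):
CONNLIMIT=$(( (PHP_WORKERS * 11 + 9) / 10 ))
echo "workers=$PHP_WORKERS connlimit=$CONNLIMIT"
```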

Thanks for taking the time to help out and explain your configuration. 





On Dec 4, 2009, at 2:50 PM, XANi wrote:

> 
> On Friday, 2009-12-04, at 14:30 -0500, Naveen Ayyagari wrote:
>> 
>> We are running mod_php on the Apache servers, and we have our connection 
>> limit set to what we consider fairly low in haproxy. The problem I am 
>> describing is more an issue with the number of processes executing on the 
>> backend machine. I guess we had assumed that if we set maxconn to a number, 
>> no more than that many connections would ever be served by the backend 
>> server at any given point. However, we see that if a request takes a while 
>> to execute on the backend, haproxy's 'timeout server' drops the connection 
>> and serves up the next one in the haproxy queue, but the original request 
>> is still processing on the backend, because Apache did not kill it and 
>> won't kill it.
>> 
>> 
>> So what ends up happening is the server gets overloaded with additional new 
>> connections, because it is busy processing requests that haproxy has already 
>> decided to stop listening for. 
>> 
>> 
>> I would like to see Apache simply stop processing when haproxy drops the 
>> connection at the 'timeout server' value, so that unneeded processing 
>> doesn't continue on the backend. 
>> 
> use "reply all" please ;p Do you have a separate backend for static content? 
> mod_php is kinda bad when it comes to serving both static and dynamic content 
> from the same machine; that's because even when you request a 10-byte gif, 
> Apache will use a big, heavy process with mod_php loaded. mod_fcgid (or 
> mod_fastcgi) + php-cgi is usually much better (and you can use mpm_worker, 
> which is faster and handles more connections too). 
> 
> What I have on my setup is:
> -the server that serves dynamic content has 2x the number of cores in PHP 
> processes through FastCGI; haproxy has a per-server connection limit a bit 
> higher than the number of PHP processes (say php_processes*1.1), so the 
> server doesn't have to wait for a new request
> -the server that serves static content has a much higher connection limit 
> (like 100 or 200), because it usually has it in cache anyway
> 
> so basically what I do is:
> -do not run too many PHP processes; 2x the number of cores is OK, because if 
> you run too many, context switching will eat your CPU, and RAM is better used 
> for caching, not another 20 PHP processes ;]
> -limit the number of connections to the "dynamic" servers so there aren't 40 
> requests waiting to be handled, but 2-10 max, with the rest queued on haproxy 
> (another plus is that if one server is fully loaded it doesn't "hog" 
> connections but lets haproxy requeue them)
> -separate PHP from Apache (or better, just use lighttpd/nginx)
> 
> Regards 
> Mariusz

