Hello Smain,

On Tue, May 07, 2013 at 09:32:17AM +0200, Smain Kahlouch wrote:
> Hello Lukas,
> 
> Thanks for your answer.
> Our backend is a webserver providing streaming services.
> 
> From my understanding, the user won't be directed to another backend. It
> will wait until a connection is freed:
> 
> "Limits the sockets to this number of concurrent connections. Extraneous
> connections will remain in the system's backlog until a connection is
> released."

This is about the frontend's maxconn. The server maxconn moves the request
to one of two possible queues:
  - the server's own queue if the request must be handled exclusively by
    this server (e.g. cookie, hash, ...)
  - the backend's queue otherwise

When the backend queue is selected, any other server with spare slots
can pick the request. So yes, the load will spread over all servers
ensuring that they're always used at full throttle and not more.
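To make this concrete, here is a minimal configuration sketch of such a
setup (server names, addresses and limits are illustrative only, they are
not taken from this thread) :

    frontend ft_stream
        bind :8080
        maxconn 1000              # beyond this, connections wait in the kernel backlog
        default_backend bk_stream

    backend bk_stream
        balance roundrobin
        cookie SRV insert indirect nocache
        server s1 192.168.0.11:8080 maxconn 100 cookie s1
        server s2 192.168.0.12:8080 maxconn 100 cookie s2

With this, a request pinned to s1 by its cookie waits in s1's own queue
when its 100 slots are busy, while an unpinned request goes to the backend
queue and is picked up by whichever server frees a slot first.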

Willy
