On Thu, Nov 16, 2017 at 05:30:05PM +0100, Tim Düsterhus wrote:
> William,
>
> On 15.11.2017 at 21:17, William Lallemand wrote:
> > These problems have been fixed in the master with the following commits:
> >
> > 75ea0a06b BUG/MEDIUM: mworker: does not close inherited FD
> > fade49d8f BUG/MEDIUM: mworker: does not deinit anymore
> > 2f8b31c2c BUG/MEDIUM: mworker: wait again
On Thu, Nov 16, 2017 at 06:33:35PM +0100, Olivier Houchard wrote:
Hi,
The first patch attempts to fix session resumption with TLS 1.3, when
haproxy acts as a client, by storing the ASN1-encoded session in the struct
server, instead of storing the SSL_SESSION *directly. Directly keeping
SSL_SESSION doesn't seem to work well when concurrent connections are made
us
Try
proxy_buffering off;
proxy_request_buffering off;
in nginx
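For context, those two directives belong in the http, server, or location block of the proxy configuration. A minimal sketch follows; the location and upstream address are illustrative assumptions, not taken from the thread:

```nginx
location / {
    proxy_pass http://127.0.0.1:8080;  # hypothetical upstream
    proxy_buffering off;               # stream upstream responses to the client
    proxy_request_buffering off;       # stream client request bodies to the upstream
}
```

With both off, nginx passes data through as it arrives instead of spooling whole requests and responses first, which matters for large POST bodies like the ones described above.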
On Nov 15, 2017 8:01 PM, "omer kirkagaclioglu" wrote:
Hi,
I just put a service that has around 400 - 4K http / https mixed requests
per second behind haproxy. The endpoint with the highest rate of requests
is a POST request with
2017-11-16 16:24 GMT+01:00 omer kirkagaclioglu :
Hi Lukas,
Thanks for the quick answer. I am using haproxy on another service which
consists of GET requests with very small query parameters. It load balances
to a backend with 4 servers with 3K-20K requests per second. This time I
see 3400K goroutines waiting to read the request, although th