Hi all,

We use an nginx -> haproxy (1.3.20 currently) -> mongrels setup to serve our
website, and recently we noticed an issue where haproxy returns a 400
error for requests with very long headers (for us, typically a long URL, a long
referrer, or the combination of both).  I checked the change logs and didn't
see anything that might address this in newer versions.  While I haven't
narrowed down the exact limit yet, it's around 6500 bytes.  This occurs only
when the config file has mode http; in mode tcp the request goes through
fine.  So perhaps it's related to some buffer limit in the HTTP parsing?  If so,
is the limit documented somewhere, and is there a setting to increase it?
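In case it helps, this is roughly the kind of knob I was hoping to find.  I haven't confirmed these globals exist in 1.3.20 (they may only be in newer versions, and in 1.3 the buffer size might be fixed at build time via a BUFSIZE define), so treat this as a sketch of what I'm after rather than something I know works:

```
# Hypothetical global tuning -- names/values unverified for 1.3.20.
global
    tune.bufsize    32768   # total per-request buffer size
    tune.maxrewrite 8192    # space reserved in the buffer for header rewrites
```

For what it's worth, I can reproduce the 400 with something as simple as a curl request carrying a ~6500-byte Referer header against our haproxy frontend, so it doesn't seem specific to our app's traffic.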
While long urls are generally not a good idea, we don't always have full
control over the referrer and for a couple of reasons we'd like to support
very long urls in a few parts of our site.

Tim
