Hi Lukas,
  I know the response is broken, as Baptiste said.
  But nginx, acting as a reverse proxy, handles this website fine; it would be
nice if haproxy could cope with such a site as well.
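In case it is useful as a stopgap while the backend gets fixed: haproxy has an option to relay responses it would otherwise reject. A minimal sketch (the backend name and server address below are placeholders, not our real config):

```
backend www_abc
    # relay responses haproxy's parser considers invalid
    # (a workaround only -- it does not repair the broken headers)
    option accept-invalid-http-response
    server s1 192.0.2.10:80
```

This only relaxes haproxy's response parsing; fixing the backend is still the real cure.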

The debug output of wget is:


---request begin---
GET / HTTP/1.0
User-Agent: Mozilla/5.0 (Windows NT 6.2; WOW64; rv:26.0) Gecko/20100101
Firefox/26.0
Accept: */*
Host: www.abc.com
Connection: Keep-Alive

---request end---
HTTP request sent, awaiting response...
---response begin---
HTTP/1.1 200 OK
Content-Length: 355
Content-Type: text/html
Content-Location: http://www.abc.com/
Last-Modified: Tue, 22 Mar 2011 04:46:53 GMT
Accept-Ranges: bytes
ETag: "c8167c2a4ce8cb1:8d3"
XXXXXXXXXXXXXXXXXXXXXXXXX
MicrosoftOfficeWebServer: 5.0_Pub
XXXXXXXXXXXXXXXXXXXXX
Date: Mon, 30 Dec 2013 06:00:59 GMT
Connection: keep-alive
Set-Cookie:FORTIWAFSID=HG2BQ1JRP4SFFDCJJFHVBUUY0NG7SD0Z;Path=/

---response end---
200 OK

BR,
DeltaY

2013/12/31 Lukas Tribus <[email protected]>

> Hi,
>
>
> > HTTP/1.1 200 OK
> > Date: Mon, 30 Dec 2013 05:40:02 GMT
> > XXXXXXXXXXXXXXXXXXXXXXXXX
> > MicrosoftOfficeWebServer: 5.0_Pub
> > XXXXXXXXXXXXXXXXXXXXX
> > XXXXXXXXXXXXXXXXXXXXXXXXXXX
> >
> > Cache-Control: private
> > Content-Type: text/html; charset=utf-8
> > Content-Length: 73803
> >
> > <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
>
> If this really is the response as-is, then I'm not surprised
> it's not passing through haproxy.
>
> Not only are the headers totally invalid, but an HTTP user-agent or
> proxy would parse everything starting with Cache-Control as
> payload.
>
> Get your backend fixed, really. Very likely your $vendor already
> has a fix for this (this is so bogus, a browser doesn't even see
> the Content-Length and thus cannot do any keep-alive).
>
>
>
> Regards,
>
> Lukas
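For what it's worth, the premature blank line really does have the effect Lukas describes: any generic header/body split loses Content-Length. A throwaway Python sketch (not haproxy's actual parser; the response bytes are retyped from the quote above, with the redacted header lines dropped):

```python
# A strict HTTP/1.1 parser stops reading headers at the first blank line.
# The broken response has a blank line *before* Cache-Control and
# Content-Length, so everything after it is treated as body.
raw = (
    b"HTTP/1.1 200 OK\r\n"
    b"Date: Mon, 30 Dec 2013 05:40:02 GMT\r\n"
    b"MicrosoftOfficeWebServer: 5.0_Pub\r\n"
    b"\r\n"                                   # premature end of headers
    b"Cache-Control: private\r\n"
    b"Content-Length: 73803\r\n"
    b"\r\n"
    b"<!DOCTYPE html ...>"
)

# Split at the FIRST blank line, as any HTTP parser must.
head, _, body = raw.partition(b"\r\n\r\n")
headers = dict(
    line.split(b": ", 1) for line in head.split(b"\r\n")[1:]
)
print(b"Content-Length" in headers)       # False: it landed in the body
print(body.startswith(b"Cache-Control"))  # True
```

So a compliant client never sees Content-Length at all, which is why keep-alive cannot work either.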
