Blair Zajac <[EMAIL PROTECTED]> writes:

> Gisle Aas wrote:
> >
> > It is not clear what the best thing to do is when you find a bad header
> > line or reach one of the limits.  Some alternatives are:
> > 
> >    - Just die (LWP will turn it into a 500 response, but all headers
> >      and content will be lost).  This is what Net::HTTP currently does,
> >      and it does not seem acceptable.
> > 
> >    - Stop reading headers; the bad header line and any headers following
> >      it will show up as data in the content of the response.
> > 
> >    - Continue reading until the header count limit is reached; we risk
> >      that part of the content will end up in headers and that some
> >      leading bytes of the content will be chopped off.
> > 
> > We could do a mix of the latter two based on how bad the header is, or
> > perhaps there are other strategies.  Comments?
> 
> I think we'll need to do both, since this solution by itself doesn't address
> the problem of a duplicate HTTP status line for the URL
> 
> http://cgi.liveauctions.ebay.com/aw-cgi/eBayISAPI.dll?ViewItem&item=5974873
> 
> Or, if we find a bad header, then we can check the headers against the size
> limits after that.

If we assume that a duplicate HTTP status line is a common error, then it
might make sense to special-case it somehow.  Does that URL still give
you a bad header?  It does not for me.
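A mix of the latter two strategies from the list above, plus the duplicate
status line special case, could be sketched like this.  Python is used purely
for illustration; the header-count limit and the "looks like a header"
heuristic are assumptions, not what Net::HTTP actually does:

```python
import re

MAX_HEADERS = 128  # assumed header-count limit, not Net::HTTP's actual value

def parse_headers(lines):
    """Leniently parse raw header lines into (name, value) pairs.

    - A repeated HTTP status line is silently dropped (the special case
      for the common server bug discussed above).
    - Any other malformed line ends header parsing; it and all remaining
      lines are returned as leftover content, so no content bytes are
      chopped off.
    """
    headers = []
    for i, line in enumerate(lines[:MAX_HEADERS]):
        if re.match(r"HTTP/\d+\.\d+ \d{3}", line):
            continue  # duplicate status line: assume server bug, skip it
        m = re.match(r"([!#$%&'*+\-.^_`|~0-9A-Za-z]+):\s*(.*)", line)
        if not m:
            # stop here; this line and everything after it is content
            return headers, lines[i:]
        headers.append((m.group(1), m.group(2)))
    return headers, lines[MAX_HEADERS:]
```

The point of returning the leftover lines is that the caller can decide
whether to treat them as body content instead of dying outright.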

Regards,
Gisle

> > Does anybody know what Mozilla, Opera and/or MSIE do with bad headers?
> 
> For this URL, I checked that Mozilla, Opera, Netscape and IE all display the
> page, so they may just look for two \015?\012 sequences.
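That heuristic, splitting the response at the first blank line no matter what
the header block contains, could be sketched as follows (Python for
illustration only; this is a guess at what the browsers do, not their actual
code):

```python
import re

def split_head_body(raw: bytes):
    """Split a raw HTTP response at the first blank line (\\015?\\012
    twice), without validating individual header lines at all."""
    m = re.search(rb"\r?\n\r?\n", raw)
    if m is None:
        return raw, b""  # no blank line yet: everything is still head
    return raw[:m.start()], raw[m.end():]
```

With this approach even a doubled status line ends up harmlessly inside the
head part, which would explain why all four browsers render the page.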
> 
> > 
> > We probably also need to ignore any Transfer-Encoding headers, since we
> > will likely not manage to stay in sync with the content they describe.
> 
> For the above case, you would ignore perfectly fine headers.
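One compromise between the two positions is to drop Transfer-Encoding only
when a bad header line was actually seen, so well-formed responses keep their
headers intact.  A minimal sketch in Python; `saw_bad_header` is an assumed
flag produced by the header parser, not an existing Net::HTTP attribute:

```python
def effective_headers(headers, saw_bad_header):
    """Return the headers to act on.

    If parsing hit a bad line we may be out of sync with the body, so a
    chunked or other transfer coding cannot be trusted and is dropped.
    Otherwise all headers, including Transfer-Encoding, are kept.
    """
    if not saw_bad_header:
        return list(headers)
    return [(n, v) for (n, v) in headers if n.lower() != "transfer-encoding"]
```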
