DO NOT REPLY TO THIS EMAIL, BUT PLEASE POST YOUR BUG RELATED COMMENTS THROUGH THE WEB INTERFACE AVAILABLE AT <http://nagoya.apache.org/bugzilla/show_bug.cgi?id=22827>. ANY REPLY MADE TO THIS MESSAGE WILL NOT BE COLLECTED AND INSERTED IN THE BUG DATABASE.
http://nagoya.apache.org/bugzilla/show_bug.cgi?id=22827

Content-Length is a signed int.

           Summary: Content-Length is a signed int.
           Product: Apache httpd-1.3
           Version: 1.3.28
          Platform: Other
        OS/Version: Other
            Status: NEW
          Severity: Normal
          Priority: Other
         Component: core
        AssignedTo: [email protected]
        ReportedBy: [EMAIL PROTECTED]

I'm assuming this is a conscious choice (that it's not fixed in 1.3.x), but
as I did not find the bug in your database, I figured I would report it and
get the official answer. If this is "not to be fixed", it would be great to
add it to the FAQ.

The problem is that Content-Length in 1.3.x is a signed int. If I remember
the HTTP 1.1 spec correctly, Content-Length is actually unsigned, or at
least negative values have no meaning. The signed-int limitation causes
Apache 1.3.x to return negative content lengths for files over 2 GiB in
size. wget handles this OK, but curl and many browsers do not. (We've had
to hack our version of curl to handle these large negative values.)

My proposed "hack" would be to typecast the signed int to an unsigned int
before printing, but I imagine you all have good reasons not to do this; I
would just love to hear them. (If that were done, 1.3.x would suddenly
support 2-4 GiB files correctly.)

Anyway, thanks for your time.

-eric

P.S. Should I file a feature request against Apache 2.x asking that it use
something larger than an unsigned int to hold the content length? My last
reading through the code suggested that 2.x would also break at the 4 GiB
barrier, which we are rapidly approaching (at least here at Apple).

---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
