
http://issues.apache.org/bugzilla/show_bug.cgi?id=28898

Large file support (> 2GB) for platforms w/ 32-bit size_t and 64-bit off_t

------- Additional Comments From [EMAIL PROTECTED]  2004-05-15 05:26 -------
> $ HEAD http://localhost:8900/big/bigfile | grep Content-Length
> Content-Length: 3145728000

I don't think it's big enough.  It's definitely larger than INT_MAX, but
still well within UINT_MAX, so it fits comfortably in 32 bits:

3145728000 = 0xbb80_0000 < 0xffff_ffff
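
A quick standalone check of that arithmetic (my own sketch, assuming a
platform with 32-bit int):

  /* Confirms 3145728000 overflows signed 32-bit but fits unsigned. */
  #include <limits.h>
  #include <stdio.h>

  int main(void)
  {
      unsigned long n = 3145728000UL;                /* 0xbb800000 */
      printf("INT_MAX      = %d\n", INT_MAX);        /* 2147483647 */
      printf("UINT_MAX     = %u\n", UINT_MAX);       /* 4294967295 */
      printf("n >  INT_MAX : %d\n", n > (unsigned long)INT_MAX);   /* 1 */
      printf("n <= UINT_MAX: %d\n", n <= (unsigned long)UINT_MAX); /* 1 */
      return 0;
  }

So a test file has to exceed 4294967295 bytes before it can overflow
even an unsigned 32-bit length.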

FYI, to make large files quickly (and sparsely), you can do:

  perl -e 'truncate shift, shift' file size

or

  truncate -s size file

if your platform has truncate(1).
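
For completeness, here is the same trick from C (a sketch only; the
"bigfile" name and the 5 GiB size are made-up examples), using open(2)
plus ftruncate(2) so the file gets created if it doesn't exist:

  /* Creates a sparse 5 GiB file, larger than UINT_MAX, so it really
   * exercises the 64-bit off_t path.  _FILE_OFFSET_BITS gives us LFS
   * on 32-bit platforms; it must be defined before any includes. */
  #define _FILE_OFFSET_BITS 64
  #include <fcntl.h>
  #include <stdio.h>
  #include <unistd.h>

  int main(void)
  {
      off_t size = (off_t)5 * 1024 * 1024 * 1024;
      int fd = open("bigfile", O_WRONLY | O_CREAT, 0644);
      if (fd < 0 || ftruncate(fd, size) != 0) {
          perror("bigfile");
          return 1;
      }
      close(fd);
      return 0;
  }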

> BTW, your HEAD test was with 2.0, I presume.

Correct.  1.3.x hardcodes them all as "long", while 2.0.x uses the
apr_* types.
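
For illustration (my sketch, not code from either tree): the 2.0.x style
is an apr_off_t plus its matching format macro, where 1.3.x would have
used a plain long and "%ld":

  /* Prints a >2GB length via APR's 64-bit-capable offset type.
   * Assumes the APR headers are on the include path. */
  #include <stdio.h>
  #include <apr.h>    /* apr_off_t, APR_OFF_T_FMT */

  int main(void)
  {
      apr_off_t len = (apr_off_t)3145728000UL;
      /* APR_OFF_T_FMT expands to the right conversion for apr_off_t */
      printf("Content-Length: %" APR_OFF_T_FMT "\n", len);
      return 0;
  }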

Dan the Truncated Man
