>     Hi. I am using RedHat Enterprise Linux, with (alleged) support for
> +2Gb files. The filesystem itself seems to be able to handle such big
> files, since I can gzip -d archives to files much larger than 2Gb. BUT,
> when catting together two files that end up larger than 2Gb in size,
> the following happens:
> 
>     cat: write error: File too large
> 
> The same happens if I redirect the output of a program into a file that
> ends up larger than 2Gb. Any idea on how to solve this? I find it hard
> to believe that this is an error in the filesystem since I could ungzip
> huge files. OR, did RedHat screw these mods of theirs up somehow?
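
[As a quick sanity check, not from the original report: the filesystem's
large-file support can be confirmed without gzip by writing a single byte
just past the 2Gb mark, which creates a sparse file and uses almost no
disk. The /tmp/bigfile path is just an example.]

    # Write one byte at offset 2^31; succeeds only if the filesystem
    # and the dd binary both handle offsets past 2Gb.
    dd if=/dev/zero of=/tmp/bigfile bs=1 count=1 seek=2147483648 2>/dev/null
    stat -c%s /tmp/bigfile    # 2147483649 if the filesystem copes

If that works but cat still fails, the problem is in the utilities, not
the filesystem, which matches what you are seeing.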

I can confirm your findings on a Mandrake system.  I looked at the
package spec file for Redhat sh-utils-2.0-1 and found this in their
configuration:

CFLAGS="$RPM_OPT_FLAGS" ../configure --prefix=/usr --disable-largefile --enable-pam

That looks to me like they are explicitly disabling large file
support.
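
[A sketch of what that switch controls, assuming gcc and glibc; the file
names below are my own. Configuring with large-file support makes the
build add -D_FILE_OFFSET_BITS=64, which widens off_t so write() can go
past 2^31-1 bytes; --disable-largefile skips that, so the utilities get
the old 32-bit off_t.]

    cat > off_t_size.c <<'EOF'
    #include <stdio.h>
    #include <sys/types.h>
    int main(void) { printf("%d\n", (int)sizeof(off_t)); return 0; }
    EOF
    gcc off_t_size.c -o plain && ./plain   # 4 on 32-bit x86 of that era
    gcc -D_FILE_OFFSET_BITS=64 off_t_size.c -o lfs && ./lfs   # 8

Rebuilding the package with that flag removed from the configure line
should give you a cat that can write past 2Gb.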

Jim has previously written:
> There are some bugs in fileutils-4.0.
> I suggest you use the latest test release instead:
> 
>   ftp://alpha.gnu.org/gnu/fetish/fileutils-4.0x.tar.gz

fileutils-4.0z.tar.gz is the latest test version.  

Bob
_______________________________________________
Bug-textutils mailing list
[EMAIL PROTECTED]
http://mail.gnu.org/mailman/listinfo/bug-textutils
