Hi. I am using Red Hat Enterprise Linux, with (alleged) support for
files larger than 2GB. The filesystem itself seems to be able to handle
such big files, since I can gunzip archives into files much larger than
2GB. BUT, when catting together two files that end up larger than 2GB
in size, the following happens:

    cat: write error: File too large

The same happens if I redirect the output of a program into a file that
ends up larger than 2GB. Any idea how to solve this? I find it hard to
believe that this is an error in the filesystem, since I could gunzip
huge files. Or did Red Hat somehow get these modifications of theirs wrong?

   Thanks, in hope of a quick response...


                                            Gudmundur, Iceland

_______________________________________________
Bug-textutils mailing list
[EMAIL PROTECTED]
http://mail.gnu.org/mailman/listinfo/bug-textutils