http://nagoya.apache.org/bugzilla/show_bug.cgi?id=1053
+============================================================================+
| FixCRLF in 1.2 and 1.3 breaks on very large files                          |
+----------------------------------------------------------------------------+
| Bug #: 1053                        Product: Ant                            |
| Status: NEW                        Version: 1.2                            |
| Resolution:                        Platform: PC                            |
| Severity: Major                    OS/Version: Windows NT/2K               |
| Priority: Medium                   Component: Core tasks                   |
+----------------------------------------------------------------------------+
| Assigned To: [EMAIL PROTECTED]                                             |
| Reported By: [EMAIL PROTECTED]                                             |
| CC list: Cc:                                                               |
+----------------------------------------------------------------------------+
| URL:                                                                       |
+============================================================================+
| DESCRIPTION                                                                |

FixCRLF pre-loads the entire file into memory before processing it, which does
not work on very large files: the task throws a java.lang.OutOfMemoryError
during execution.

Unfortunately, the implementation of FixCRLF keeps the properties established
through its setters in private variables with no corresponding getters, so you
can't subclass FixCRLF to easily repair the problem. The fix must be made in
the FixCRLF source itself.

This problem exists in both Ant 1.2 and 1.3; I doubt it is platform-specific.
The size of file you can process will vary with the OS, the available memory,
and the JVM (I've seen some inconvenient internal array size limits in various
JVMs, but I don't know whether that is a factor in this bug). In my case I was
trying to process a 39 MB file with 320 MB of RAM on an NT 4.0 laptop (with
assorted development tools running while using Ant) and JDK 1.3.0-C.
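The report implies the fix is to stream the file rather than buffer it whole.
As a rough illustration only, here is a minimal Java sketch of that idea. It
is not Ant's FixCRLF code: the class name, method name, and file names are
invented, it uses modern try-with-resources syntax (which would not compile on
the JDK 1.3 mentioned above), and it only normalizes line endings, ignoring the
tab and EOF handling the real task also performs.

    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.FileReader;
    import java.io.FileWriter;
    import java.io.IOException;

    // Hypothetical sketch of a streaming line-ending converter; not Ant code.
    public class StreamingEolFixer {

        /**
         * Copies 'src' to 'dest', rewriting every line ending as 'eol'
         * (e.g. "\r\n" or "\n"). Only one line is held in memory at a time,
         * so memory use stays bounded regardless of file size.
         */
        public static void fixEol(String src, String dest, String eol)
                throws IOException {
            try (BufferedReader in = new BufferedReader(new FileReader(src));
                 BufferedWriter out = new BufferedWriter(new FileWriter(dest))) {
                String line;
                // readLine() strips \n, \r, or \r\n, so writing 'eol' after
                // each line normalizes whatever the input used.
                while ((line = in.readLine()) != null) {
                    out.write(line);
                    out.write(eol);
                }
            }
        }

        public static void main(String[] args) throws IOException {
            // Example (hypothetical file names): convert a large file to LF.
            fixEol("big-input.txt", "big-output.txt", "\n");
        }
    }

A line-at-a-time loop like this would avoid the OutOfMemoryError on a 39 MB
input, since the whole file never needs to fit in memory at once.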
