I was reading each file twice: first into an array, then into a single scalar, for different kinds of processing.
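Something like this, schematically (the file name is just made up for the example):

open my $fh, '<', 'deed.txt' or die "Can't open deed.txt: $!";
my @lines = <$fh>;                        # one array element per line

seek $fh, 0, 0;                           # rewind for the second pass
my $wholefile = do { local $/; <$fh> };   # slurp the whole file into one scalar
close $fh;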

This worked well until the files began growing in size. The largest I've encountered so far is about 45 MB. When I try reading such a file into memory, well, things get ugly fast.

I tried using Tie::File, but that module changes the source files. I need them to remain pristine.

I was reading the entire file into memory because my code contains lines such as the following:

my $count = 0;
$count++ while $wholefile =~ /(drilling|mineral|land) rights/gi;

This construct is efficient for counting matches across the whole file in a single pass.

Looks like I will have to rewrite the script to process files line by line, unless I stumble upon a miracle module on CPAN.
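Roughly what the rewrite would look like, with the same pattern counted per line ($file standing in for the real file name):

open my $fh, '<', $file or die "Can't open $file: $!";
my $count = 0;
while (my $line = <$fh>) {
    $count++ while $line =~ /(drilling|mineral|land) rights/gi;
}
close $fh;

The one catch I can see is a phrase split across a line boundary, e.g. "mineral\nrights"; a per-line match would miss that.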

If anyone knows of an efficient way to process large files line by line, other than the standard while (<FH>) loop, please let me know.
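One alternative I've been toying with (untested sketch, so treat the chunk size and tail length as guesses): read the file in fixed-size chunks and carry a short tail from one chunk to the next, so a phrase that straddles a chunk boundary still matches and nothing gets counted twice.

open my $fh, '<', $file or die "Can't open $file: $!";   # $file is a placeholder
my $count = 0;
my $tail  = '';
while (read($fh, my $chunk, 1 << 20)) {    # 1 MB at a time
    my $text = $tail . $chunk;
    while ($text =~ /(drilling|mineral|land) rights/gi) {
        # a match lying entirely inside the carried tail was already
        # counted on the previous pass, so only count matches ending past it
        $count++ if $+[0] > length $tail;
    }
    # keep one character less than the longest phrase, "drilling rights" (15 chars)
    $tail = length($text) >= 14 ? substr($text, -14) : $text;
}
close $fh;
print "$count match(es)\n";

That keeps memory use flat no matter how large the file grows.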

-- Craig

_______________________________________________
ActivePerl mailing list
[EMAIL PROTECTED]
To unsubscribe: http://listserv.ActiveState.com/mailman/mysubs
