Chris Wagner wrote:
At 02:46 PM 9/26/05 -0400, Craig Cardimon wrote:
I am reading portions of large files into string variables. The
number and size of the files vary by month. Right now I'm working
with nearly 70,000 files, the largest of which is 7 GB. Not all files
are processed, but all are scanned line by line for keywords. If a
keyword is found, the entire file is read in for section-by-section
processing.
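[A minimal sketch of that line-by-line scan, for reference. The file
name and keyword list here are made up for illustration; the point is
that the while(<$fh>) loop holds only one line in memory at a time,
rather than slurping the whole file.]

    use strict;
    use warnings;

    my $file     = 'big_input.dat';                  # illustrative name
    my @keywords = qw(ERROR TIMEOUT OVERFLOW);       # illustrative keywords
    my $pattern  = join '|', map { quotemeta } @keywords;
    my $re       = qr/$pattern/;

    open my $fh, '<', $file or die "Cannot open $file: $!";
    my $hit = 0;
    while (my $line = <$fh>) {      # one line at a time, not a slurp
        if ($line =~ $re) {
            $hit = 1;
            last;                   # stop scanning on the first match
        }
    }
    close $fh;
    print "keyword found in $file\n" if $hit;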
So I take it that it's the "section processing" that is exhausting your
memory? It seems like this exact same issue came up on the list a few
months ago: scanning huge files for keywords. Did you learn anything from
that? I need to see your code to be of any more help.
I believe it was called "chunk processing," or "chunking," or something
like that, wasn't it?
I didn't need it at the time. Now the files are numerous enough and big
enough that I may very well have to use it.
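[For anyone searching the archives: "chunking" just means reading the
file in fixed-size pieces instead of slurping a multi-gigabyte file
into one string. A minimal sketch, with an assumed 8 MB chunk size and
an illustrative file name; note that handling a line or keyword that
straddles a chunk boundary is omitted here for brevity.]

    use strict;
    use warnings;

    my $file       = 'big_input.dat';     # illustrative name
    my $chunk_size = 8 * 1024 * 1024;     # 8 MB per read; tune to taste

    open my $fh, '<:raw', $file or die "Cannot open $file: $!";
    my $buffer;
    while (read($fh, $buffer, $chunk_size)) {
        process_section($buffer);         # handle one piece, then let it go
    }
    close $fh;

    sub process_section {
        my ($chunk) = @_;
        # section-by-section work on this chunk only; a real version
        # would carry any partial trailing line over to the next chunk
    }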
-- Craig