At 10:01 27/03/2001 +0200, Kamphuys, ing. K.G. wrote:
>Hi all,
>
>I have a question regarding opening very large webserver logfiles.  They are
>about a gigabyte each and I have seven of them so I run out of memory.

Have you tried reading line-by-line?

>This is what I do now:
>
>for $file (@logfiles) {
>  open (FILE, "$file");

while (<FILE>) {
    # $_ holds the current line of your file
    # ... process it here ...
}
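
That way Perl keeps only one line in memory at a time instead of slurping
the whole file.  A minimal sketch of the complete loop, assuming your
analysis just accumulates counts into a hash (the %count hash and the
split on whitespace are placeholders for whatever you actually do with
each line):

my %count;
for my $file (@logfiles) {
    open(FILE, $file) or die "Can't open $file: $!";
    while (<FILE>) {
        chomp;                        # strip the trailing newline
        my ($host) = split ' ', $_;   # hypothetical: first field of the log line
        $count{$host}++;              # keep only the summary, not the lines
    }
    close FILE;
}

The memory used then grows only with the size of the summary data, not
with the size of the logfiles.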

>Though I try to save memory by deleting each line that isn't necessary
>anymore (line 6), I still swallow up a complete file at a time, so I need
>memory for both the logfile and the growing data structure of analyzed
>results.
>
>Now I've read somewhere that there is a more efficient way to open large
>files, but I cannot remember it anymore.  Who can help me out?
>
>Thanks in advance,
>
>Koen Kamphuys
>Webmaster of http://www.minlnv.nl/, during the foot-and-mouth disease
>crisis the most successful site of the Dutch government


_______________________________________________
Perl-Win32-Users mailing list
[EMAIL PROTECTED]
http://listserv.ActiveState.com/mailman/listinfo/perl-win32-users
