Hello all,
I'm having problems loading big files into memory - maybe you could help
me solve them.
My data file is a big (~250MB) text file with eight tab-separated
fields per line. I want to load the entire file into a list.
I've narrowed the code down to this:
-------------
#!/usr/bin/perl
use strict;
use warnings;
use Data::Dumper;
use Devel::Size qw (size total_size);
my @probes;
while (<>) {
    my @fields = split(/\s+/);
    push @probes, \@fields;    # keep a reference to this line's fields
}
print "size = ", size(\@probes), "\n";
print "total size= ", total_size(\@probes), "\n";
print "data size = ", total_size(\@probes) - size(\@probes), "\n";
print Dumper(\@probes), "\n";
------------
(Can't get any simpler than that, right?)
But when I run the program, the perl process consumes 2.5GB of memory,
prints "out of memory" and stops.
I know that perl isn't the most efficient memory consumer, but surely
there's a way to do it...
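One idea I sketched out (only tried on a small sample, so take it as a
rough sketch, not a tested fix) is to keep each record as a single
joined string and split it back into fields only when a record is
actually used, something like:
-------------
#!/usr/bin/perl
use strict;
use warnings;

# Keep one scalar per line instead of an arrayref holding eight
# separate scalars; split back into fields only on demand.
my @probes;
while (my $line = <>) {
    chomp $line;
    push @probes, $line;
}

# Example of using a record later (fields are tab-separated):
if (@probes) {
    my @fields = split /\t/, $probes[0];
    print "first field of first record: $fields[0]\n";
}
-------------
That keeps one string per line instead of nine scalars (eight fields
plus the arrayref), but maybe there's a better way?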
If you care to test it yourselves, here's a simple script that creates a
dummy text file, similar to my own data file:
-----
#!/usr/bin/perl
foreach (1..2100000) {
    print join("\t",
        "LONG-TEXT-FIELD", 11111, 222222, 3333333, 44444444,
        5555555, 6666666,
        "VERY-VERY-VERY-VERY-VERY-VERY-VERY-VERY-VERY-LONG-TEXT-FIELD"
    ), "\n";
}
-----
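(To reproduce, save the generator as, say, gen_data.pl, run
"perl gen_data.pl > dummy.txt", and then feed that file to the first
script, e.g. "perl load.pl dummy.txt" -- the file names here are just
placeholders.)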
Thanks in advance for your help!
Assaf.