I cut and pasted your code and ran it against a 1,070,725,103-byte (~1 GB)
file with about 100 bytes per line, and it finished in a split second.

Are the lines in your file giant (more than 1 MB per line)?
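
If you are not sure, a quick scan along these lines will tell you. This is
an untested sketch (I am assuming the same humongous.txt filename from your
script); it reads fixed 1 MB blocks with read() instead of the line-reading
operator, so it reports the longest line without ever assembling one in
memory:

###########################
#!/usr/bin/perl
use strict;
use warnings;

my $from = "humongous.txt";   # assumed: same filename as in your script
open(my $inp, '<', $from) or die "cannot open input file: $!";

# Read fixed 1 MB blocks and track the longest run of bytes between
# newlines, so a file with few or no newlines cannot stall the scan.
my ($max, $run, $buf) = (0, 0, '');
while (read($inp, $buf, 1 << 20)) {
    # splitting on a captured separator keeps the "\n" pieces in the list
    for my $piece (split /(\n)/, $buf, -1) {
        if ($piece eq "\n") {
            $run = 0;                   # a line just ended
        } else {
            $run += length $piece;      # a line may span several blocks
            $max = $run if $run > $max;
        }
    }
}
close($inp);
print "longest line: $max bytes\n";
###########################

If that prints something in the megabyte range, the time is going into
assembling those giant "lines", and reading in fixed-size blocks as above
is the way around it.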

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, March 13, 2002 11:20 AM
To: [EMAIL PROTECTED]
Subject: Processing Large Files


Friends,

I need to process 300+ MB text files. I tried to open one such file, read 10
lines (line by line), and output those 10 lines to a new text file. Despite
reading line by line, it took a very long time. Is there anything I can try
to speed it up?

Here is my quick Perl code:

###########################
#!/usr/bin/perl
use strict;
use warnings;
my $from        = "humongous.txt";
open(INP, '<', $from) or die "cannot open input file: $!";
open(OUT, '>', "sample.txt") or die "cannot open output file: $!";
print("trying to read input file\n");
while(<INP>){
        print STDOUT;      # echo the line to the terminal
        print OUT;         # copy the line to sample.txt
        last if($. >= 10); # $. is the current input line number
}
close(INP);
close(OUT);
##########################


Thanks,
Rex



