Hello all,

I have a few files that average about 30MB each and need to be processed by
a Perl script.  The script ends up taking almost an hour.  The problem is
that the script cannot run for more than an hour, because another instance
is kicked off every hour (the same script, different data).

What is the best way to read in and process the data?  Should I read every
line of the file into an array and then process each element, or process
each line directly as it is read?
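
To make the question concrete, here is roughly what I mean by the two
approaches (the file name and process_line() are just placeholders for the
real data and logic):

  #!/usr/bin/perl
  use strict;
  use warnings;

  my $file = 'data.log';    # placeholder for one of the 30MB files

  # Option 1: slurp every line into an array first, then process.
  # The whole file sits in memory at once.
  open my $fh, '<', $file or die "Can't open $file: $!";
  my @lines = <$fh>;
  close $fh;
  process_line($_) for @lines;

  # Option 2: process each line directly as it is read.
  # Only one line is in memory at a time.
  open $fh, '<', $file or die "Can't open $file: $!";
  while (my $line = <$fh>) {
      process_line($line);
  }
  close $fh;

  sub process_line {
      my ($line) = @_;
      # ... the real per-line work goes here ...
  }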

Does anyone have scripts that process large files quickly?  I'd love to see
examples of how you did it.

Thanks,
Kevin


