Re: memory issues?

2007-01-19 Thread Bertrand Baesjou
Paul Johnson wrote: On Fri, Jan 19, 2007 at 03:17:19PM +0100, Bertrand Baesjou wrote: foreach $line (<INFILE>) { See, this isn't a while loop, as you have in the subject. That is the cause of your problems. Damn, not very awake today I think. I also left an old subject line in

Re: memory issues?

2007-01-19 Thread Octavian Rasnita
From: Bertrand Baesjou [EMAIL PROTECTED] Paul Johnson wrote: On Fri, Jan 19, 2007 at 03:17:19PM +0100, Bertrand Baesjou wrote: foreach $line (<INFILE>) { See, this isn't a while loop, as you have in the subject. That is the cause of your problems. Damn, not very awake today I

Re: memory issues?

2007-01-19 Thread Bertrand Baesjou
Octavian Rasnita wrote: From: Bertrand Baesjou [EMAIL PROTECTED] Paul Johnson wrote: On Fri, Jan 19, 2007 at 03:17:19PM +0100, Bertrand Baesjou wrote: foreach $line (<INFILE>) { See, this isn't a while loop, as you have in the subject. That is the cause of your problems. Damn,

Re: memory issues?

2007-01-19 Thread Xavier Noria
On Jan 19, 2007, at 5:53 PM, Bertrand Baesjou wrote: Thank you very much, this is indeed the solution. The explanation is that when you process lines this way foreach my $line (<FH>) { ... } the readline operator is evaluated in list context and, thus, the file is slurped into a single
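The context distinction described above can be sketched as follows (a minimal illustration; the filename and per-line processing are placeholders, not from the original thread):

```perl
use strict;
use warnings;

# Slurping: in list context, <$fh> builds a list of ALL lines
# before the loop starts, so memory grows with the file size.
open(my $fh, '<', 'big.log') or die "open: $!";
foreach my $line (<$fh>) {
    # process $line
}
close $fh;

# Streaming: in scalar context, <$fh> reads ONE line per
# iteration, so memory use stays roughly constant.
open($fh, '<', 'big.log') or die "open: $!";
while (defined(my $line = <$fh>)) {
    # process $line
}
close $fh;
```

Both loops visit the same lines; only the `while` form keeps a single line in memory at a time, which is why it is the idiomatic way to read large files in Perl.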

memory issues reading large files

2002-02-07 Thread Brian Hayes
Hello all. I need to read through a large (150 MB) text file line by line. Does anyone know how to do this without my process swelling to 300 megs? I have not been following the list, so sorry if this question has recently come up. I did not find it answered in the archives. Thanks, Brian

Re: memory issues reading large files

2002-02-07 Thread Brett W. McCoy
On Thu, 7 Feb 2002, Brian Hayes wrote: Hello all. I need to read through a large (150 MB) text file line by line. Does anyone know how to do this without my process swelling to 300 megs? As long as you aren't reading that file into an array (which would be a foolish thing to do, IMHO), I

Re: memory issues reading large files

2002-02-07 Thread Brian Hayes
You should be using something like open(FILE, $file) or die "$!\n"; while(<FILE>){ ## do something } close FILE; __END__ This is what I am doing, but before any of the file is processed, the whole text file is moved into memory. The only solution I can think of is to break

Re: memory issues reading large files

2002-02-07 Thread Brian Hayes
It appears the problem was using the foreach statement instead of while. I have not tested this extensively, but using foreach the whole text file (or output of pipe) is read into memory before continuing, but using while (and probably for) each line is processed as it is read. Thanks for all
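One point worth adding to the observation above: in Perl, `for` and `foreach` are interchangeable keywords, so `for my $line (<FH>)` slurps exactly as `foreach` does. A hedged sketch of the streaming replacement (reading from STDIN as a stand-in for the pipe mentioned in the message):

```perl
use strict;
use warnings;

# 'for' and 'foreach' are the same loop in Perl; either one
# evaluates the readline operator in list context and slurps
# the whole input first. The fix is the scalar-context loop:
while (defined(my $line = <STDIN>)) {
    chomp $line;
    # per-line work goes here, e.g.:
    print "line: $line\n";
}
```

The `defined(...)` guard matters for input that may contain a lone `"0"` line, which would otherwise terminate the loop early as a false value.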

Re: memory issues reading large files

2002-02-07 Thread Brett W. McCoy
On Thu, 7 Feb 2002, Brian Hayes wrote: It appears the problem was using the foreach statement instead of while. I have not tested this extensively, but using foreach the whole text file (or output of pipe) is read into memory before continuing, but using while (and probably for) each line is