--- Perrin Harkins <[EMAIL PROTECTED]> wrote:

> > sub parse {
> >   my ($class,$file) = @_;
> >   my @data;
> >   open my $F, $file or die $!;
> >   while ( my $line = <$F> ) {
> >     my @fields = split /=/, $line;
> >     push @data, \@fields;
> >   }
> >   close $F;
> >   return \@data;
> > }
> 
> If you read enough data into @data to use up 20MB,
> it will stay that
> size.  That's a good thing if you intend to read
> another file of
> similar size on the next request.  This would only
> be bad if you read
> a very large amount of data in but only now and
> then.
> 
> The best way to avoid this kind of problem is to not
> read the whole
> thing into RAM.  You can pass an iterator object to
> TT instead of
> loading all the data at once.

How do you do that? I have to parse the file to get the
data first. Do you mean parsing the file on each
iteration, getting the next available record, and
returning it to TT?
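
Something like this, maybe? Just a rough sketch to check my
understanding; the FileIterator package and next_record method are
names I made up, not anything TT provides:

package FileIterator;

sub new {
    my ($class, $file) = @_;
    open my $fh, '<', $file or die "can't open $file: $!";
    return bless { fh => $fh }, $class;
}

# Parse and return one record per call; undef at end of file.
sub next_record {
    my $self = shift;
    my $line = readline( $self->{fh} );
    return undef unless defined $line;
    chomp $line;
    return [ split /=/, $line ];
}

1;

Then in the template I guess the loop would pull one record at a time,
e.g. [% WHILE (row = parser.next_record) %] ... [% END %], rather than
FOREACH over a big arrayref. I'd have to check whether assignment
inside a WHILE condition works like that, though.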

Hypothetically, in a database environment, that would mean
hitting the database with a fetch_next_row call on every
iteration, right?
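
In DBI terms I imagine it would look roughly like this (untested, and
the DSN and table are made up), fetching one row at a time instead of
slurping everything with fetchall_arrayref:

use DBI;

my $dbh = DBI->connect( 'dbi:mysql:mydb', 'user', 'pass',
                        { RaiseError => 1 } );
my $sth = $dbh->prepare('SELECT name, value FROM settings');
$sth->execute;

# only one row is held in memory at a time
while ( my $row = $sth->fetchrow_hashref ) {
    print "$row->{name} = $row->{value}\n";
}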
 
> - Perrin
> 

James.
