On Tue, Aug 13, 2002 at 06:32:45PM -0400, [EMAIL PROTECTED] wrote:
> > 
> > Any thoughts on using the new DBI feature, specifying maxrows in a
> > fetchall_arrayref?
> > 
> > If I use something like
> > 
> >  while ( my $array_ref=$sth->fetchall_arrayref(undef, $maxfetch)) {
> >  # ... do something
> >  }
> 
> See example from Tim's advanced DBI talk, last item.

(No excuse for not quoting search.cpan.org URLs now :)

  http://search.cpan.org/src/TIMB/DBI_AdvancedTalk_2002/sld020.htm

but I think the same problem applies to that.

Umm, I think I'll change fetchall_arrayref so it silently returns
undef when called on an inactive handle if a batch size is given.
[later: done, will be in the next release]
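Something like this should then just work against the next release (untested
sketch; $dbh, the query, and the column names are only placeholders):

  my $sth = $dbh->prepare("SELECT id, name FROM big_table");   # made-up query
  $sth->execute;
  my $maxfetch = 100_000;
  # With a batch size given, fetchall_arrayref() will quietly return
  # undef once the handle is no longer Active, which ends the loop.
  while ( my $batch = $sth->fetchall_arrayref(undef, $maxfetch) ) {
    last unless @$batch;        # guard against a final empty batch
    for my $row (@$batch) {
      # ... process $row (an array ref of column values)
    }
  }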

Meanwhile, try this one (untested):

  my $rows = []; # cache for batches of rows
  while( my $row = shift(@$rows)
      || ($sth->{Active} && shift(@{ $rows = $sth->fetchall_arrayref(undef, $maxfetch) || [] }))
  ) {
    ...
  }

(Using '||' rather than 'or', and '&&' rather than 'and', is important here:
the low-precedence word forms bind more loosely than the '=' assignment.)
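If you want to see the difference, here's a tiny self-contained demo of the
precedence point (nothing to do with DBI; reload() is just a made-up stand-in
for the fetchall_arrayref call):

  use strict;
  use warnings;

  sub reload { [ 'fresh row' ] }   # pretend this is the batch fetch

  my $rows = [];
  my $with_pipes = shift(@$rows) || shift(@{ $rows = reload() });
  # $with_pipes is 'fresh row': '||' keeps the whole expression on the
  # right-hand side of the '=' assignment.

  $rows = [];
  my $with_or = shift(@$rows) or shift(@{ $rows = reload() });
  # $with_or is undef: 'or' binds more loosely than '=', so this parses
  # as (my $with_or = shift(@$rows)) or shift(...) and the reloaded row
  # is fetched but never assigned.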

> > [BTW: Using this method, I get back 100,000 rows a time from a very large
> > Oracle table, at 6 seconds a fetch - that's quick!]

:-)

Tim.

p.s. The fetchall_arrayref docs for the next release will say:

    If $max_rows is defined and greater than or equal to zero then it
    is used to limit the number of rows fetched before returning.
    fetchall_arrayref() can then be called again to fetch more rows.
    This is especially useful when you need the better performance of
    fetchall_arrayref() but don't have enough memory to fetch and return
    all the rows in one go. Here's an example:
     
      my $rows = []; # cache for batches of rows
      while( my $row = ( shift(@$rows) || # get row from cache, or reload cache:
                         shift(@{$rows=$sth->fetchall_arrayref(undef,10_000)||[]}) )
      ) {
        ...
      }
     
    That is the fastest way to fetch and process lots of rows using the DBI.

I've not bothered mentioning the 
