On Tue, Mar 17, 2009 at 6:44 AM, baxy77bax <[email protected]> wrote:
>
> hi
>
> i need help with this one.
>
> i have this perl script that goes something like this:
>
> my $fetchrow_stmt;
>
> sub _fetchrow_stmt {
>
>  my ($self,%arg) = @_;
>  my $stm = "select * from $arg{table}";
>  $fetchrow_stmt = $dbh->prepare($stm) || die $dbh->errstr;
>  $fetchrow_stmt->execute || die $dbh->errstr;
>
> }
>
>  sub _return_row {
>
> my ($self,%arg) = @_;
> return $fetchrow_stmt->fetchrow_arrayref();
>
>
>  }
>
> sub _finish_stmt {
>
>  my ($self,%arg) = @_;
>
>  $fetchrow_stmt->finish();
>
> }
>
> the thing is that it's using my memory like crazy, and the source of this
> behaviour (I THINK/not sure) is in buffering the query results from sqlite.
> so is there a way to limit that, so that the query buffer holds only 2
> results max at a time (not the whole table)?

ufff! I would try

sub _fetchrow_stmt {

 my ($self,%arg) = @_;
 my $stm = "select * from $arg{table}";
 my $fetchrow_stmt = $dbh->prepare($stm) || die $dbh->errstr;
 $fetchrow_stmt->execute || die $dbh->errstr;
 while (my $row = $fetchrow_stmt->fetchrow_arrayref) {
        # do something with the $row here
 }
}

instead of all the other cruft you have, which makes for potential
problems. In any case, fetchrow_arrayref, as the name says, does not
fetch the whole table. It fetches the next row as a reference to an
array. There is no way that can use too much memory... it is simply
fetching a reference to one row at a time.
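
To make that concrete, here is a minimal, self-contained sketch of the
row-at-a-time pattern. The in-memory database, the "demo" table, and the
sample rows are made up for illustration; RaiseError replaces the manual
"|| die" checks.

```perl
use strict;
use warnings;
use DBI;

# Illustrative in-memory SQLite database (not the OP's real schema).
my $dbh = DBI->connect("dbi:SQLite:dbname=:memory:", "", "",
                       { RaiseError => 1, AutoCommit => 1 });

$dbh->do("CREATE TABLE demo (id INTEGER PRIMARY KEY, name TEXT)");
my $ins = $dbh->prepare("INSERT INTO demo (name) VALUES (?)");
$ins->execute($_) for qw(alpha beta gamma);

# Prepare once, then fetch one row per call to fetchrow_arrayref.
my $sth = $dbh->prepare("SELECT id, name FROM demo ORDER BY id");
$sth->execute;

my @seen;
while (my $row = $sth->fetchrow_arrayref) {
    # DBI may reuse the same arrayref on every fetch, so copy the
    # row if you need to keep it after the next call.
    push @seen, [@$row];
}
$sth->finish;
```

Only one fetched row is live inside the loop at any moment; @seen here
exists just to show what came back.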






-- 
Puneet Kishor http://www.punkish.org/
Nelson Institute for Environmental Studies http://www.nelson.wisc.edu/
Carbon Model http://carbonmodel.org/
Open Source Geospatial Foundation http://www.osgeo.org/
Sent from: Madison WI United States.
_______________________________________________
sqlite-users mailing list
[email protected]
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
