P Kishor-3 wrote:
> 
> On Tue, Mar 17, 2009 at 6:44 AM, baxy77bax <[email protected]> wrote:
>>
>> hi
>>
>> i need help with this one.
>>
>> i have this perl script that goes something like this:
>>
>> my $fetchrow_stmt;
>>
>> sub _fetchrow_stmt {
>>
>>  my ($self,%arg) = @_;
>>  my $stm = "select * from $arg{table}";
>>  $fetchrow_stmt = $dbh->prepare($stm) || die $dbh->errstr;
>>  $fetchrow_stmt->execute || die $dbh->errstr;
>>
>> }
>>
>>  sub _return_row {
>>
>> my ($self,%arg) = @_;
>> return $fetchrow_stmt->fetchrow_arrayref();
>>
>>
>>  }
>>
>> sub _finish_stmt {
>>
>>  my ($self,%arg) = @_;
>>
>>  $fetchrow_stmt->finish();
>>
>> }
>>
>> the thing is that it's using memory like crazy, and the source of this
>> behaviour (I think, not sure) is the buffering of the query results from
>> sqlite.
>> so is there a way to limit that, so that the query results hold only 2
>> rows max at a time (not the whole table)?
> 
> ufff! I would try
> 
> sub _fetchrow_stmt {
> 
>  my ($self,%arg) = @_;
>  my $stm = "select * from $arg{table}";
>  my $fetchrow_stmt = $dbh->prepare($stm) || die $dbh->errstr;
>  $fetchrow_stmt->execute || die $dbh->errstr;
>  while (my $row = $fetchrow_stmt->fetchrow_arrayref) {
>       # do something with the $row here
>  }
> }
> 
> instead of all the other cruft you have which makes for potential
> problems. In any case, fetchrow_arrayref, as the name says, does not
> fetch the whole table. It fetches the next row as a ref to an array.
> There is no way that can use too much memory... it is simply fetching
> a pointer to an array.
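> 
> A minimal, self-contained sketch of that row-at-a-time pattern (this is
> illustrative only: it assumes DBD::SQLite is installed, and the table,
> column names, and row count are made up for the example):
> 
> #!/usr/bin/perl
> use strict;
> use warnings;
> use DBI;
> 
> # In-memory SQLite database so the sketch is self-contained.
> my $dbh = DBI->connect("dbi:SQLite:dbname=:memory:", "", "",
>                        { RaiseError => 1 });
> 
> $dbh->do("CREATE TABLE t (id INTEGER, name TEXT)");
> my $ins = $dbh->prepare("INSERT INTO t (id, name) VALUES (?, ?)");
> $ins->execute($_, "row$_") for 1 .. 1000;
> 
> my $sth = $dbh->prepare("SELECT id, name FROM t");
> $sth->execute;
> 
> my $rows = 0;
> while (my $row = $sth->fetchrow_arrayref) {
>     # $row is a reference to the current row only; SQLite steps
>     # through the result set one row per fetch call.
>     $rows++;
> }
> $sth->finish;
> print "$rows\n";   # prints 1000
> 
> The whole result set is never held in Perl at once; each fetch returns a
> reference to just the current row.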
> 
> 
> -- 
> Puneet Kishor http://www.punkish.org/
> Nelson Institute for Environmental Studies http://www.nelson.wisc.edu/
> Carbon Model http://carbonmodel.org/
> Open Source Geospatial Foundation http://www.osgeo.org/
> Sent from: Madison WI United States.
> _______________________________________________
> sqlite-users mailing list
> [email protected]
> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
> 
> 


ok, the separate subs are there for a reason, but just to make sure I redid
everything as you suggested, and the memory usage, as you can see from top:

"  25   0  954m 839m 2540 S   97  5.2   0:31.00 "

is still too high (~840MB), and this is for a pure printout:

sub _fetchrow_stmt {

 my ($self,%arg) = @_;
 my $stm = "select * from $arg{table}";
 my $fetchrow = $dbh->prepare($stm) || die $dbh->errstr;
 $fetchrow->execute || die $dbh->errstr;
 open(OUT, ">", "van.txt") || die $!;
 while (my $row = $fetchrow->fetchrow_arrayref) {
  print OUT "$row->[0],$row->[1]\n$row->[2]\n";
 }
 close OUT;
}


could the use of transactions be holding something in memory?
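As an aside, one way to guarantee that at most two rows are ever
materialized per query is to page through the table with LIMIT/OFFSET.
This is a hedged sketch, not the thread's resolution: DBD::SQLite is
assumed available, and the table and batch size are invented for
illustration:

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("dbi:SQLite:dbname=:memory:", "", "",
                       { RaiseError => 1 });
$dbh->do("CREATE TABLE t (id INTEGER)");
my $ins = $dbh->prepare("INSERT INTO t (id) VALUES (?)");
$ins->execute($_) for 1 .. 7;

# Fetch at most two rows per query, advancing the offset each pass.
my $sth = $dbh->prepare("SELECT id FROM t ORDER BY id LIMIT 2 OFFSET ?");
my ($offset, $seen) = (0, 0);
while (1) {
    $sth->execute($offset);
    my $batch = $sth->fetchall_arrayref;   # at most 2 rows in memory
    last unless @$batch;
    $seen += @$batch;
    $offset += 2;
}
print "$seen\n";   # prints 7

Note that OFFSET paging rescans from the start of the result on each
query, so for large tables keyset paging (WHERE id > ? ... LIMIT 2) is
usually the better-scaling variant of the same idea.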

thank you !

