Ron, Thanks again.
Tim, I am using psql.
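Since this is PostgreSQL rather than MySQL, the mysql_use_result attribute won't apply. With DBD::Pg the whole result set is fetched into client memory on execute(), so the usual fix for a ~100K-row SELECT is a server-side cursor fetched in batches. A hedged sketch, assuming $dbh is an open DBD::Pg handle and $sqlstring is the SELECT from the original post; the cursor name "big_csr" and batch size 1000 are arbitrary choices:

```perl
$dbh->begin_work;                          # cursors need a transaction
$dbh->do("DECLARE big_csr CURSOR FOR $sqlstring");
my $fetch = $dbh->prepare("FETCH 1000 FROM big_csr");
while (1) {
    my $rows = $fetch->execute();
    last if $rows == 0;                    # execute returns "0E0" (== 0) when done
    while (my @row = $fetch->fetchrow_array()) {
        # process one row at a time here
    }
}
$dbh->do("CLOSE big_csr");
$dbh->commit;
```

This keeps client memory bounded by the FETCH batch size instead of the full result set.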
On 12/30/06, Tim Bunce <[EMAIL PROTECTED]> wrote:
Assuming you're using mysql, take a look at the mysql_use_result
attribute.
Tim.
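For the MySQL case Tim describes, the attribute can be set per statement at prepare time; with it, DBD::mysql streams rows from the server as you fetch instead of buffering the entire result set client-side. A minimal sketch, assuming $dbh is a connected DBD::mysql handle:

```perl
# mysql_use_result => 1 switches from mysql_store_result (buffer all rows)
# to mysql_use_result (stream rows from the server one at a time).
my $sth = $dbh->prepare($sqlstring, { mysql_use_result => 1 });
$sth->execute();
while (my @row = $sth->fetchrow_array()) {
    # process one row at a time here
}
```

One caveat: while such a statement handle is active, the connection is tied up until all rows are fetched or the handle is finished.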
On Thu, Dec 28, 2006 at 01:48:28PM -0800, louis fridkis wrote:
> I am running out of memory when I try to select about 100K rows. Here is
> some code:
>
> my $cel_sth = $dbh_from->prepare($sqlstring);
> $cel_sth->execute();
> my @rowvalues;
> my $rv;
> my $elementresult_id;
> my $element_id;
> my $quantification_id;
> my $signal;
>
> $rv = $cel_sth->bind_columns(\($elementresult_id, $element_id,
>                                $quantification_id, $signal));
>
> while($cel_sth->fetch()){
>     @rowvalues = ($elementresult_id, $element_id, $quantification_id,
>                   $signal);
>     my $printstring = "insert into $table_name values(";
>     foreach my $column (@rowvalues){
>         if (defined $column){
>             if($column =~ /\D/){
>                 $column = "'".$column."'"
>                     unless ($column =~ /\d*\.\d+/);
>             }
>             $printstring .= "$column,";
>         } else {
>             $printstring .= "NULL,";
>         }
>     }
>     $printstring =~ s/,$/);/;
>     print "$printstring\n";
> }
>
> I guess the problem is it tries to hold all the rows in memory. Is there a
> way to just get 1 or a few rows at a time?
> --
> Lou Fridkis
> Human Genetics
> 57920
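As a side note on the quoted code: the manual quoting in the inner loop (wrapping non-numeric values in single quotes via a regex) can be handed to DBI's quote() method, which also escapes any quote characters embedded in the data and maps undef to the literal NULL. A hedged sketch, assuming $dbh_from, @rowvalues, and $table_name from the original code:

```perl
# $dbh->quote() returns a quoted/escaped SQL literal, or the string NULL
# (unquoted) for undef, so the defined/regex checks are no longer needed.
my @quoted = map { $dbh_from->quote($_) } @rowvalues;
my $printstring = "insert into $table_name values("
                . join(',', @quoted) . ");";
print "$printstring\n";
```

One difference from the original: quote() will also quote purely numeric values unless a type is passed, which most databases accept but which changes the emitted SQL slightly.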