I am running out of memory when I try to select about 100K rows. Here is
some code:

my $cel_sth = $dbh_from->prepare($sqlstring);
$cel_sth->execute();

my ($elementresult_id, $element_id, $quantification_id, $signal);

# Bind the output columns to scalars; each fetch() refreshes them.
my $rv = $cel_sth->bind_columns(\($elementresult_id, $element_id,
                                  $quantification_id, $signal));

while ($cel_sth->fetch()) {
  my @rowvalues = ($elementresult_id, $element_id, $quantification_id,
                   $signal);
  my $printstring = "insert into $table_name values(";
  foreach my $column (@rowvalues) {
    if (defined $column) {
      # Quote values containing non-digits, unless the whole value
      # looks like a decimal number.
      if ($column =~ /\D/) {
        $column = "'" . $column . "'"
          unless ($column =~ /^\d*\.\d+$/);
      }
      $printstring .= "$column,";
    } else {
      $printstring .= "NULL,";
    }
  }
  $printstring =~ s/,$/);/;    # drop trailing comma, close the statement
  print "$printstring\n";
}
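As an aside, the quoting in the loop above will break on values that contain a single quote. A minimal, self-contained sketch of the same row-to-INSERT logic, with embedded quotes doubled (the helper name `build_insert` is mine, not from the original code; `s///r` needs Perl 5.14+):

```perl
use strict;
use warnings;

# Hypothetical helper mirroring the loop above: undef becomes NULL,
# numeric-looking values stay bare, everything else is quoted with
# embedded single quotes doubled.
sub build_insert {
    my ($table, @row) = @_;
    my @vals = map {
        !defined($_)          ? 'NULL'                   # SQL NULL for undef
      : /^-?\d+(?:\.\d+)?$/   ? $_                       # bare number
      : "'" . (s/'/''/gr) . "'"                          # quoted, ' doubled
    } @row;
    return "insert into $table values(" . join(',', @vals) . ");";
}

print build_insert('my_table', 1, "O'Brien", undef, 2.5), "\n";
# prints: insert into my_table values(1,'O''Brien',NULL,2.5);
```

If the target handle is available, `$dbh->quote($column)` does this (and more) for you.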

I guess the problem is that it tries to hold all the rows in memory. Is
there a way to fetch just one, or a few, rows at a time?
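(For future readers: with DBI, fetch() itself already returns one row at a time, so if memory still runs out the buffering usually happens inside the driver. A sketch, assuming the source handle is DBD::mysql, which by default stores the whole result set client-side before the first fetch:

```perl
# Assumption: DBD::mysql source. mysql_use_result streams rows from the
# server instead of buffering the entire result set in client memory.
my $cel_sth = $dbh_from->prepare($sqlstring, { mysql_use_result => 1 });
$cel_sth->execute();
# ... bind_columns() and the fetch() loop stay exactly as above ...
```

Other drivers may expose a similar knob, e.g. the RowCacheSize attribute.)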
--
Lou Fridkis
Human Genetics
57920
